No more ‘army of one’ with Big Data accountability revolution | #MITIQ

At this week’s MIT CDOIQ Symposium, held in Cambridge, Massachusetts, it was hard to miss that two industries have embraced the Big Data revolution: healthcare and the financial sector. The benefits of applying analytics models in these fields grow more apparent by the day. You might be surprised, however, to learn of one public sector player seeking an operational advantage via Big Data: the Department of Defense.

Mort Anvari, Director of Programs and Strategy in the office of the Deputy Assistant Secretary of the Army for Cost and Economics, has overseen the creation and implementation of directives aimed at the cost culture and cost management of the US Army.

“Private industry is easier because everyone is cost conscious,” Anvari explains. “In government, particularly during a war, that mission is driving everything. Most of our officers are only concerned with how much money they need and how they spend it.” Anvari admits that directing commanders to adopt a cost-culture mindset was a formidable task. “We had to look at it from the people’s perspective. That includes convincing leadership that attention to cost doesn’t create a bad image for the country.” The main argument centered on the perception that cost consciousness could appear to put servicemembers unnecessarily in harm’s way.

 

Surviving the Culture Clash

Anvari claims an early success in educating commanders and other officers that attention to cost and soldier safety are not mutually exclusive. “You can be cost conscious. You can do more with your resources. You can be more efficient with it and still care about safety and care about the soldiers.”

Projects budgeted above $10 million automatically trigger a cost/benefit analysis. Anvari oversees more than 2,000 Army analysts who perform and validate each one. Knowing the process, commanders will typically prepare an additional course of action alongside the one they submit for review. “Talking to commanders and leadership, we ask ‘What is your information need?’” he stated. “Based on that, we develop a data need for that organization.”

Similar to challenges faced in the private sector, one of the issues the Army had to overcome was convincing organizations that possessed data to share it with other organizations. “Communicating the data need from organization A to organization B, telling organization B you need to provide this data that is not for you, it’s for someone else,” Anvari said, “that was a big culture shock.”

By explaining the military’s funding structure as an upside-down tree, however, Anvari was able to bring an understanding that all funding comes from the top and spreads out to all of the “branches” within the Army. “We call it fund centers. They have all the money. The cost centers are the ones using this money. It could be them or it could be others. It’s truly like a neural network of information.”

Unlike private sector companies, the Army realized it had to streamline its budgeting and allocation process because it is subject to strict oversight by Congress. And because all of the funding is taxpayer money, citizens are also privy to the budgeting process through Freedom of Information Act requests.

As noted above, the cost/benefit analysis is automatic on projects in excess of $10 million. Projects under that threshold, Anvari notes, are largely self-monitored: commanders of smaller outfits want to show they can apply the new cost-culture analysis critically, in the hope of being promoted to head larger projects in the future.

“Accountability is hard to swallow,” says Anvari of the early pushback from Army leadership, “no matter how normalized the process is.” His work is seeing results, however. “Cost management is on its feet and working,” he concluded.

(Originally published at SiliconANGLE.com)

photo credit: The U.S. Army via photopin cc

You may not need Big Data after all | #MITIQ

The business buzzword of the past two years has been “Big Data”. Companies are trying to figure out how to leverage their collected data and translate it into a competitive advantage. According to Jeanne Ross, Director of the Center for Information Systems Research at MIT’s Sloan School, however, this approach is not necessarily one-size-fits-all for today’s organizations.

Ross, co-author of the article ‘You May Not Need Big Data After All’, cautions businesses against buying into the hype around Big Data.

“I think you grow into Big Data,” Ross notes. Some companies do find it confers a competitive advantage within their specific industries. As an example, she notes that the oil and gas industry has long employed Big Data to help decide when and where to place a billion-dollar well. Success in one industry, however, doesn’t necessarily translate into success in others. “Many times we know great things about our customers. We just haven’t figured out a way to address them.”

Asked whether the fear some companies feel, that they can’t address the Big Data they have, is misplaced, Ross states, “No, not misplaced at all. If you don’t think you can do it, you probably can’t.” For organizations recognizing the potential value of Big Data for the first time, this news could be disheartening.

“I don’t think most companies are data-driven,” explains Ross. “I think they are metric driven.”

This distinction is important. Today’s companies can respond to certain kinds of data, but to be truly data-driven, an organization has to recognize which data is important. As an example, Ross cites Foxtel, an Australian pay-TV provider.

“They saw what products were going out and what channels people wanted,” she states. Even with that information they were unable to make strategic decisions. “They went back and started looking at segments and realized what ‘data driven’ would be. They didn’t have the stomach to go back and do that.”

Where the CDO fits in

Discussing the emerging role of the CDO, Ross explained that organizations too often assume that once a CDO is brought in, all data issues will be addressed by that role and little further attention is required. And with Gartner projecting that 25 percent of companies will have adopted a CDO role by next year, Ross claims most of them likely don’t need to create the position.

The key to running a successful organization is identifying and maintaining a single source of truth with respect to data. Many divisions within a company will manipulate data to show that they are running at a profit or contributing significantly to the organization’s bottom line. In the long run this can be detrimental to the company, because different versions of the data can show different outcomes.

Once a company adopts a single source of truth for its data, Ross believes it is of utmost importance that the adoption be driven from the top down. “We have to let people know mistakes have to be made. The faster you make mistakes, the more you can learn and the faster you can grow.” The strategy is ineffectual, however, if it starts in the middle of the organization: people will be less willing to admit mistakes and failure if doing so hasn’t been built into the company’s cultural model.

The swiftly moving current of technology, especially over the past five years, should be viewed critically by companies hoping to gain a competitive advantage from it. Leveraging Big Data requires more than a willingness to throw money at the problem. It requires a full understanding on the part of the company as a whole.

(Originally published at SiliconANGLE.com)

photo credit: Free Grunge Textures – www.freestock.ca via photopin cc

Exploring the emerging role of the CDO | #MITIQ

Since 2006, the Massachusetts Institute of Technology has hosted its Chief Data Officer and Information Quality Symposium, highlighting the emerging role of data as a significant driver of revenue generation within an organization. The role of the CDO is the latest iteration in the evolution of the data steward.

Writing for Wired, Bob Leaper of DST Global Solutions detailed that evolution from the early ’80s through to today. The Data Processing Manager gave way to the Chief Information Officer, placing an individual with computational know-how in the boardroom. Even that ascendancy, he claims, still left a company’s data footprint, and how best to utilize it, somewhat in the shadows. The CDO role was created as a means of bridging the gap between the IT department and the company’s operations.

SiliconANGLE’s theCUBE, broadcasting from #MITIQ for the second year, will be detailing the responsibilities of this relatively new role as they slowly solidify. To that end, the live broadcast will feature interviews with academics, executives, thought leaders, experts and bloggers over the next two days.

In theCUBE’s kickoff for the event, co-host Jeff Kelly commented on how the CDO is becoming a more accepted position in the enterprise. In light of recent Gartner research claiming 25 percent of companies will have appointed a CDO by the end of next year, Kelly believes the role will be critically important for modern companies, and that it will maintain that importance even a decade from now.

With recent revelations of how some companies are utilizing their customers’ data, one newly important function of the CDO will be minimizing risk within the organization. While the goal of any company is to analyze its data in a way that drives revenue, the CDO takes on the added responsibility of ensuring all analysis is performed to the highest ethical standard. Any perceived misuse of customer data, and the backlash it creates, will fall squarely on the shoulders of the company’s CDO.

Be sure to join Dave Vellante, Jeff Kelly and Paul Gillin all this week for the live broadcast of theCUBE at siliconangle.tv.

photo credit: Pierre Metivier via photopin cc

(Originally published at SiliconANGLE.com)

Predictive analytics stepping front & center in the business world | #ibmimpact

This week, SiliconANGLE’s theCUBE broadcast from both the ServiceNow Knowledge 2014 event at the Moscone Center in San Francisco and the IBM Impact conference held at Las Vegas’ Venetian Resort and Casino. Helming theCUBE desk for IBM Impact were John Furrier and Paul Gillin. On Day 1, they welcomed James Taylor, CEO and Principal Consultant of Decision Management Solutions.

Taylor’s biography explains that he is a 20-plus year veteran in the field of Decision Management and is regarded as a leading expert in decisioning technologies. He is also the author of Decision Management Systems: A Practical Guide to Using Business Rules and Predictive Analytics.

At the start of the conversation, Furrier noted that we are in what really should be considered one of the most dynamic times for IT. He cites a recognition by lines of business that IT can drive revenue and growth, rather than serving simply as a means of cost reduction. He attributes this to significant converging trends changing the role of IT: increased speed and agility, unlimited compute in the Cloud and the fact that everything is now instrumentable.

Interjecting on this thought, Taylor stated, “I think what has really changed is the acceptance of analytics. When I wrote the book, people were uncertain about it.” He continued, “There were ways to use small data that weren’t predictive analytics. If you don’t process it and turn it into a usable prediction, it’s hard to consume it.” As the landscape has changed, he believes predictive analytics is stepping to the front and center in the business world.

Watch the interview in its entirety here:

Much of that change is being driven by the habits and expectations of an emerging, more technologically savvy consumer base. Talking about his own 25-year-old son having to purchase auto insurance, Taylor said, “When he gives you his data, he expects a quote. If you say you’ll let him know what the quote is, he’ll just go somewhere else. He wants the answer now.” That expectation requires real-time responses even when the proposition is relatively complex and involves assessments of risk. “You have to use analytics to determine how risky he is and you have to answer now.” Taylor claims that capability has to be embedded not only into call center scripts but also into a company’s mobile applications and website, where no human-to-human interaction ever occurs. “Because if you don’t, you’re missing the point,” he stated.
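What that looks like in practice is an analytic model sitting directly in the request path. Here is a minimal Python sketch of the pattern Taylor describes; the risk factors, weights and pricing rule are invented for illustration, not drawn from any real insurer:

```python
# Hypothetical instant-quote flow: score the applicant's risk and wrap it
# in a pricing rule so the answer comes back in a single round trip.
# All factors, weights and prices below are invented for illustration.

def risk_score(applicant: dict) -> float:
    """Toy stand-in for a predictive model: returns a claim probability."""
    score = 0.10                                  # baseline claim probability
    if applicant["age"] < 30:
        score += 0.08                             # younger drivers carry more risk
    score += 0.05 * applicant["prior_accidents"]  # each prior accident adds risk
    return min(score, 0.95)

def instant_quote(applicant: dict) -> float:
    """Embed the model in the request path: no 'we'll get back to you'."""
    p_claim = risk_score(applicant)
    base_premium = 600.00
    return round(base_premium * (1 + 2.5 * p_claim), 2)  # load premium by risk

print(instant_quote({"age": 25, "prior_accidents": 1}))  # -> 945.0
```

The same scoring function can sit behind a call center script, a mobile app or a website, which is precisely Taylor’s point about embedding the capability everywhere the customer shows up.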

Understanding the Principles of Decision Management

Much of the rest of the conversation centered on the Decision Management principles Taylor outlined in his book. Gillin recounted the first such principle, ‘Begin With The Decision in Mind’. “That struck me as kind of obvious,” he said. “Isn’t that how you would go about this? Obviously there is a reason why you said that. Do you find that people typically don’t?” he asked.

“The reason for misquoting the late Stephen Covey there is really twofold,” Taylor began. “The first is that when you look at Decision Support Systems, and there’s obviously a long history there, people are often very unclear what decision it is, in fact, somebody’s going to make with the data.” He goes on to state that all sorts of data is put in without any thought to the eventual decision that will need to be made. Companies, he claims, expect that the employee, whom they regard as smart and experienced, will be able to use the data to arrive at a decision. “What happens when I try to make a decision about what offer to make to a customer who’s on the phone to the call center right now?” he posited. “[The employee] has seven seconds. And they were hired yesterday. And they got three hours of training. They’re not in a position to know what decision they should be making.” He continued, “So, if you don’t know what decision you’re trying to embed the analytics into, you can’t do a good job with the analytics. I put [that principle] in because there was this sense that people were very lackadaisical about what the decisions were they were supporting with their Decision Support Systems.”

Another of Taylor’s principles outlined in his book is to ‘Be Predictive and Not Reactive.’ Gillin stated, “I think you’re right. Using data to support decisions reactively is more intuitive.” He then asked, “What is the mind shift that is involved in moving toward predictive analytics?”

“It turns out to be one of those things where it’s very easy to build things that are predictive,” Taylor stated. “But if they don’t change people’s decision making behavior, they don’t help. Talk to anyone who does predictive analytics and they’ll tell you stories of building highly predictive models that didn’t change the business,” he explained. “You have to be clear how it’s going to affect the decision you’re going to make before you can build the predictive.”

One reason Taylor believes there has been resistance thus far is that the whole notion of predictive analytics shifts the operation of business from the realm of absolutes to the realm of probabilities. “If I’m measuring last month’s results,” he said, “I can give you an absolute number. I can tell you exactly what you sold last month.” Projecting future demand doesn’t offer that same certainty. “I can give you a probability or a range,” he stated. “You have to start dealing with a little bit of uncertainty. That’s why it’s important to wrap some rules around these predictions.” However, Taylor says we, as humans, subconsciously operate on probabilities and chance every day. Once that is understood, the mind shift to predictive analytics becomes easier to make.
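“Wrapping rules around a prediction” has a concrete shape in code: the model emits a probability, and business rules convert that uncertain number into one definite action. A minimal sketch, with thresholds and actions invented for illustration:

```python
# Hypothetical decision service: business rules wrap a model's probability
# so an uncertain prediction still yields one concrete action.
# Thresholds and actions are invented for illustration.

def decide_restock(p_stockout: float, on_hand: int) -> str:
    """Turn a probabilistic demand forecast into a definite action."""
    if on_hand < 10:           # hard rule: safety stock overrides the model
        return "reorder-expedited"
    if p_stockout > 0.60:      # model is fairly sure demand will outrun stock
        return "reorder-standard"
    if p_stockout > 0.30:      # uncertain band: route to a human planner
        return "flag-for-review"
    return "no-action"

print(decide_restock(p_stockout=0.45, on_hand=40))  # -> flag-for-review
```

The rules absorb the uncertainty Taylor describes: the business never has to act on a raw probability, only on the action the rules derive from it.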

Embracing the Opportunities of Unstructured Data

One key to building a robust predictive analytics structure is incorporating multiple data streams, including unstructured data. “There are a slew of startups getting funding around Business Intelligence and data warehousing,” Furrier pointed out. “That market is shifting. But how does loose data affect some of the opportunities?” he asked.

Taylor pointed out that the ability to store data before you’ve figured out how you might use it is a strength. “That’s a key advantage,” he said. “I did some surveys recently on predictive analytics in the Cloud. We asked about some of these Big Data sources, and what we found was that the people getting value from Big Data and these unusual sources were people who had some experience with more advanced kinds of analytics.” He further explained that combining traditional data with data from e-mails, texts and other unstructured sources can only produce a more fine-grained model, improving accuracy. “Its ability to refine existing predictions is really strong. Right now, that’s the biggest use case I see in customers,” he stated.
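One way to read that point in code: derive features from unstructured text and feed them into the model alongside the traditional numeric fields. A minimal sketch using scikit-learn; the column names and the toy churn data are invented for illustration:

```python
# Hypothetical example: refine a churn prediction by combining a structured
# field (tenure) with features derived from unstructured support e-mails.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

df = pd.DataFrame({
    "tenure_months": [3, 48, 12, 60],
    "support_email": ["cancel my account", "love the service",
                      "billing is confusing", "renewing again"],
    "churned": [1, 0, 1, 0],
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "support_email"),   # unstructured signal
    ("numeric", "passthrough", ["tenure_months"]),  # traditional signal
])

model = Pipeline([("features", features), ("clf", LogisticRegression())])
model.fit(df[["tenure_months", "support_email"]], df["churned"])
print(model.predict(df[["tenure_months", "support_email"]]))
```

The text features don’t replace the traditional ones; they refine the existing prediction, which matches the use case Taylor says he sees most often.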

For more on this interview and others broadcast by SiliconANGLE’s theCUBE from both of this week’s events, be certain to visit SiliconANGLE’s YouTube Channel.

(Originally published at SiliconANGLE.com)

photo credit: cali.org via photopin cc

Big Data is coming alive, sports leading the way | #SportsDataSV

Big Data is coming alive in a very real way in Northern California. SiliconANGLE’s John Furrier and Wikibon’s Jeff Kelly sat down with top figures from the professional sports teams that call Silicon Valley home to discuss how those teams have embraced emerging technologies aimed at running successful front office operations, providing enhanced player and draft statistics on the field, and improving the overall fan experience.

Tech-Targeting Future Fans


The use of Big Data along that third avenue will be one of the drivers of wider adoption by the enterprise in other industries. The near-seamless integration of the technology in stadiums, aimed squarely at the fan, is poised to mirror the rapid, widespread adoption of mobile technology over the past decade. Once the general public gets its head around what is now an amorphous concept, via the sports arena, there will be increased demand for customer-facing applications of the technology in other industries. And that can only be achieved, as was voiced in most of the interviews, when an organization works top to bottom to use the new technology to both streamline and add depth to everything it does.

During the interview featuring Dave Koval, President of the MLS San Jose Earthquakes, the most interesting takeaway was the realization that even though each interview dealt with a major league sport, there is no one-size-fits-all approach to implementation. Koval cited specific technology requirements based on the team’s fan demographic. Speaking to the organization’s philosophy on tech, Koval said, “I see it as an enabler. It helps create a better experience. One-third of our fans are millennials who only interact via their mobile. If you don’t support that and their needs, they turn off.”

But perhaps the most interesting innovation on the fan front is how the Earthquakes have endeavored to communicate with current and potential fans. “We have a Business Information unit with data scientists taking all of the information they can from soccer fans and bringing them together around the Quakes,” Koval stated. “We even look at the [style of] language they use online and try to ascertain their personality to tailor our communications to them. It allows us to be able to market to them one-on-one.”

Focusing On The Fan Experience


If football is more your cup of tea, you’ll want to head to the SiliconANGLE channel on YouTube to watch the interview with Doug Garland, General Manager of Stadium Experience and Technology for the San Francisco 49ers. The team appears to have taken a page from NASCAR with regard to improving overall fan engagement, and its future initiatives are, no doubt, going to revolutionize the fan experience in a big way.

Garland provided an interesting equation during the conversation. Citing the 160-foot-wide Mitsubishi HD video board installed at AT&T Stadium in Arlington, Texas, he stated no such centerpiece would be going into the new Levi’s Stadium. The stadium, with a capacity of 70,000, sits in a region of the country where users replace their roughly $1,000 mobile devices every year and a half. Rather than spend $70 million on a large centerpiece screen, the 49ers opted to build a network, including WiFi, that would put to work the screens nearly every fan carries with them.
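Spelled out, and taking Garland’s figures as quoted, the equation is simple: 70,000 fans × $1,000 per device ≈ $70 million of display hardware already walking through the gates, the same outlay as a giant centerpiece board, and refreshed at the fans’ own expense every eighteen months.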

Watch the interviews in their entirety here:

At the new Levi’s Stadium, the fan experience via mobile device will begin before fans even drive onto the property. From directing them to the stadium and to an available parking spot using real-time data, to letting them know which bathroom has the shortest line, the fan’s handheld device will be key to an improved experience.

“What else are we going to do? One of the other areas that we know is a big hassle for fans at the game is waiting in long lines at concession stands. If you go to a football game and you get hungry, you are trying to figure out which part of which quarter do I have to miss so I can go get that hot dog,” Garland explained. “That’s a trade-off we don’t want our fans to have to make. So what we are going to enable at Levi Stadium is the ability for any fan sitting in any section of the stadium to order food and beverage to their seat using their mobile device.”

But the football fan experience would be nothing without the beloved replay. The San Francisco 49ers have the “competition on the couch” firmly in their crosshairs with a value-add they are perfecting right now for the new stadium: replays streamed to every app-enabled mobile device in the stadium within five seconds of every play. As Garland notes, this will apply to every play, even plays with controversial calls and plays currently under review.

After Moneyball


An analysis of sport and Big Data that left out the data-heavy game of baseball would be incomplete. The importance of data to the game was highlighted by the 2011 film Moneyball. Asked about the tension the film portrayed between the old school scouts and the data geeks, Bill Schlough, Senior VP and CIO of the San Francisco Giants, said, “I think that movie was closer to reality than a lot of people think. What’s interesting about old school scouts is that they rely on data as much as the new guys. They don’t always realize they are doing it though. They look at biomechanics.” He continued, “We are now quantifying what was once unquantifiable. That information was in the heads of the scouts. Now we have computational data models.”

While the 49ers and the Earthquakes are able to embrace emerging technology from scratch as they build new stadiums, the Giants play in a 14-year-old facility, albeit one nationally recognized for its design and vistas. Upgrading an existing stadium presents its own challenges. Schlough, echoing a sentiment from every other guest of the evening, stressed the importance of providing a robust WiFi and 4G LTE environment for fans, which requires a great deal of behind-the-scenes wiring and upgrades. If fans can’t use technology that is ubiquitous in their day-to-day lives at a sports facility, the overall fan experience is significantly diminished.

The Data Driven Front Office


Focusing on Big Data in the front office, John Tortora, COO of the San Jose Sharks, discussed several key areas where operations have improved through the embrace of new technology. It should be noted that the principal owner of the Sharks is Hasso Plattner, co-founder of SAP, so it should come as no surprise that the organization’s business operation and customer service software is SAP branded. The team uses the software for tailored employee evaluation and more efficient inter-office collaboration.

Tortora also pointed out that the NHL has recognized the importance of Big Data at the league office: through its weekly and bi-weekly analytics reports, individual teams are able to evaluate their own ticket and box sales and compare their volume and price points against every other team in the league.

We are standing on the precipice of a bold new perspective on the conduct of business. Recognizing the value both to their own organizations and to the individual fan, sports organizations like the NHL, MLB, NASCAR, NFL and MLS are gently bringing their fans in for a soft landing, easing understanding and acceptance of concepts that, for now, remain murky to much of the general public and even some in the business world. The excitement brimming just under the surface is poised to revolutionize almost every aspect of our daily lives, and, for now, it appears sport is leading the way.

(Originally published at SiliconANGLE.com)

Viewing The World Through A PRISM

The Edward Snowden saga has been playing out dramatically across the front pages of newspapers and nightly newscasts worldwide. The effects of his very public revelation of the National Security Agency’s (NSA) surveillance program, PRISM, still have not been fully realized. However, enough time has passed since he first sat down with Glenn Greenwald of The Guardian that a reexamination of the events seems in order.

Here’s a quick recap of what we do know. Since 2007, the NSA has been provided direct access to troves of data collected by for-profit companies like Google, Yahoo!, Microsoft, AOL and others. That data includes e-mails, search histories, live chats and file transfers. While the law that established the PRISM program claims only non-US citizens are subject to surveillance, Americans who communicate with anyone outside the United States are not exempted.

Each of the companies associated with PRISM initially offered flat-out denials that they in any way cooperated with, participated in, or even knew of the existence of the NSA surveillance program. Those denials rang hollow earlier this month, however, when it was reported that Microsoft had helped the NSA circumvent its own encryption, aiding the agency in intercepting live web chats on Microsoft’s Outlook.com portal. Microsoft also worked with the agency to provide easier access to its cloud storage service, SkyDrive.

Since the idea of a government-run surveillance program went from probability to confirmed reality, statements and opinions have poured in from government officials, pundits, security analysts and citizens alike. The legality of the program will have to be fought out by constitutional scholars and lawyers. For the rest of us, let’s look at the nature of privacy and how being ever more tied to our mobile and digital devices strips away our anonymity.

Does Privacy Exist Anymore?

In a recent study published in the March edition of Nature’s journal Scientific Reports, researchers pored over mobility data for 1.5 million individuals, derived from their mobile providers’ antennas. They determined that, at hourly resolution, four spatio-temporal points are all that is needed to uniquely identify 95 percent of the individuals. Their study, entitled ‘Unique in the Crowd: The privacy bounds of human mobility’, poses an interesting conundrum for how future frameworks must be designed to protect individual privacy.
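A toy sketch of the study’s core idea, in Python: given a set of location traces, count how many are pinned down uniquely by a handful of (hour, antenna) points. The three traces below are invented; the actual study worked from antenna-level records for 1.5 million people:

```python
# Toy illustration of mobility-trace uniqueness: how many of these invented
# traces are uniquely identified by k randomly chosen (hour, antenna) points?
import random

random.seed(0)  # reproducible sampling for the demo

# Each trace maps hour-of-day -> antenna id (fake data for illustration).
traces = {
    "user_a": {8: "ant1", 12: "ant3", 18: "ant1", 22: "ant4"},
    "user_b": {8: "ant1", 12: "ant2", 18: "ant5", 22: "ant4"},
    "user_c": {8: "ant2", 12: "ant3", 18: "ant1", 22: "ant6"},
}

def is_unique(user: str, k: int) -> bool:
    """Do k random points from this user's trace match nobody else's?"""
    points = random.sample(sorted(traces[user].items()), k)
    matches = [u for u, trace in traces.items()
               if all(trace.get(hour) == antenna for hour, antenna in points)]
    return matches == [user]

for k in (1, 2, 3):
    unique = sum(is_unique(u, k) for u in traces)
    print(f"{k} point(s): {unique}/{len(traces)} traces uniquely identified")
```

At the study’s scale, with antenna-level spatial resolution and hourly time resolution, four such points sufficed to single out 95 percent of 1.5 million traces.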

As they state in their introduction, “…the notion of privacy has been foundational to the development of our diverse societies, forming the basis for individuals’ rights such as free speech and religious freedom. Despite its importance, privacy has mainly relied on informal protection mechanisms.” The team highlights an important 19th-century publication by Samuel Warren and Louis Brandeis, prompted by the rise of photography and yellow journalism, which argued that privacy law must evolve in response to technological change.

That argument remains especially true in a world dominated by modern information technologies like the Internet and mobile phones. Mobility data has previously been used for research purposes as well as to provide personalized services to users. However, a sufficiently motivated organization could use such data to, for instance, track the movements of a competitor’s sales force, determine an individual’s place of worship, or know when someone has been at a particular motel or abortion clinic. For this reason, the research team suggests that maintaining individual privacy in the age of the smartphone requires the individual to engage in idiosyncratic movement.

Angered Allies

Immediately after the PRISM program was brought to light, the US faced a backlash from many of its European allies, who were none too happy about the United States possibly spying on their citizens. An article in Germany’s Der Spiegel reported that more than 20 million German phone connections and 10 million Internet data sets are monitored by the NSA on an average day; on busier days, the 20 million figure jumped as high as 60 million.

In the very same article, however, allegations of complicity were leveled at the German government. The published interview, conducted with Snowden before he became the public face of the scandal, states that the partnerships are organized so that authorities in other countries can “insulate their political leaders from the backlash” in the event it becomes public “how grievously they’re violating global privacy.” Prior to its publication, German Chancellor Angela Merkel had decried the revelations, likening them to “Cold War” tactics.

While Britain has its own surveillance program, codenamed ‘Tempora’, and a report in the French daily Le Monde stated that France is engaged in a widespread surveillance scheme of its own, German citizens have particular reason to be sensitive where spying and surveillance are concerned. Their not-too-distant past holds examples of intrusive surveillance and snooping, both in the former communist German Democratic Republic and during the Nazi era.

The Battle In Britain

The aforementioned ‘Tempora’ program, operated by Britain’s GCHQ eavesdropping agency, is known in the intelligence world as a “full take”. As Snowden detailed, “It sucks up all information, no matter where it comes from and which laws are broken. If you send a data packet and it goes through Britain, we’ll get it. If you download anything, and the server is in Britain, we’ll get it.”

In fact, when the NSA decides to target an individual, it virtually assumes full control of that person’s data. Effectively, it takes over the individual’s computer. That computer, as Snowden says, “…more or less belongs to the US government.”

The intrusion has led to lawsuits over an individual’s right to privacy, filed both in the US and the UK. Privacy International filed suit against the British government, citing both PRISM and Tempora, which taps major internet cables around the world. As the privacy activist group argues, there is no publicly accessible legal framework covering the NSA’s spying on British citizens and its passing of the resulting data to UK authorities, a tactic that would clearly be illegal if the British collected the data themselves, and that absence calls the legality of both PRISM and Britain’s own Tempora program into question.

Privacy International’s research chief, Eric King says, “One of the underlying tenets of law in a democratic society is the accessibility and foreseeability of a law. If there is no way for citizens to know of the existence, interpretation or execution of a law, then the law is effectively secret. And secret law is not law. It is a fundamental breach of the social contract if the Government can operate with unrestrained power in such an arbitrary fashion.”

A similar lawsuit, filed by the American Civil Liberties Union in the United States, has effectively been quashed as the Obama administration asserted the FISA court has no requirement to publish its decisions, even those that address the constitutionality of mass surveillance programs.

Part 2 of this series, to be published Wednesday, July 31, will explore if and how large intelligence gathering schemes can be dismantled, what the future of governmental and private corporate surveillance means to you, and how you can take measures to protect your identity and privacy in this brave new world.

(originally published at SiliconANGLE.com)

What The Dell Is Going On In Austin?

Since 1997, DELL has consistently ranked among the top three PC companies in terms of market share. Unfortunately for the PC market, sales have been steadily declining in favor of more mobile platforms, such as smartphones and tablets.

So what is a company whose bread and butter has been PC sales to do? If you’ve paid any attention to the computing world over the past few months, you are aware of the effort by Michael Dell and private equity firm Silver Lake to take DELL off the public market and return it to privately held status.

Carl Icahn, one of the largest single shareholders in the company, has been less than enthused by this plan and even set about trying to take over DELL with a leveraged buyout proposition of his own. The vote, originally scheduled for Wednesday of this week, has been postponed to Friday, August 2, in light of a final amended proposal by Dell and Silver Lake that raises their bid ten cents above the highest offer so far.

In a letter to shareholders, Dell claimed, “After one of the most thorough processes in history, the highest price that any of the parties was willing to pay was $13.65 per share. Although no other party has offered to pay more than $13.65 per share, Silver Lake and I have now increased our offer to $13.75 per share, an increase to public shareholders of approximately $150 million, which is our best and final offer.”
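The math behind that figure follows directly from the quote: a ten-cent raise spread across the roughly 1.5 billion shares held by public shareholders works out to the approximately $150 million Dell cites ($0.10 × 1.5 billion = $150 million).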

Dell, in the same shareholder letter, claims the company must go private because it needs to transform, and transform quickly. As noted in an article for ComputerWeekly.com, with the dramatic crash in sales of personal computers, Dell is looking to revive his company by focusing on a transition to a software and services firm.

Even amid the boardroom drama unfolding in real time, DELL has announced its intention to move in this direction, highlighting recent key updates to its OpenStack-powered cloud services as well as its open source Hadoop services.

Additionally, DELL says it intends to develop and support the Dasein open source project. Dasein is a Java-based cloud abstraction layer suited to application development.

“Dasein is the backbone of DELL Multi-Cloud Manager and is invaluable to users ranging from commercial software vendors to individual developers,” said George Reese, executive director of Cloud Computing at DELL.

Dell finished his letter by asking investors to use their vote, one way or the other. Under the rules of the agreement, the buyout can only occur with the approval of a majority of the unaffiliated shares in the company, and any shareholder who abstains is counted as a vote against. With more than 25 percent of shareholder votes still outstanding, the outcome of Dell’s buyout remains in question.

photo credit: Gonzalo Merat via photopin cc

(Originally published at SiliconANGLE.com)