T-Talkin’ ‘Bout Your Reputation


Those who have the term ‘marketing’ anywhere in their job title know that the last few years have been an exercise in on-the-job learning, re-educating ourselves about the existence and importance of clicks, impressions, likes, shares, and mentions, to name but a few. Some view the new paradigm in marketing as a minefield to be cautiously navigated, with one misstep meaning the difference between accolades and the unemployment line. The more forward-thinking marketers see this vast landscape as a new world, ripe with opportunities to shape and deliver their company’s brand.

Today, we will talk about one opportunity in particular: Brand Mentions. Reading this, you will learn:

  • Why brand mentions are one of the most important (and least predictable) components of your marketing strategy.
  • How to monitor brand mentions.
  • How to recover customers through genuine brand mention interactions.
  • Why brand mentions are a great barometer for reputation management.

Start Small, Go Big

First, let’s get this part out of the way: a brand mention is any instance in which a customer shares information online about your company and directly cites your company’s name in that post, tweet, blog, review, etc. Unsolicited comments, precisely because they are unsolicited, can be one of the biggest drivers of trust your company could receive. Of course, that is only the case if the comment praises your product and/or service. It’s a different story if the comment is negative.

The folks over at HelpScout.net put out an excellent and exhaustive list, ‘75 Customer Service Stats and Quotes You Can’t Afford To Ignore’, that details a lot of things many of us intuitively already know. It also points out some great opportunities for tightening up the service and marketing sides of your business.

For instance, we all know that a bad experience reaches more than twice as many individuals as a good experience. That statistic is all the more damning when you introduce the online space, where individuals have a social reach that typically exceeds what would even be possible from in-person interactions only.

People are out there talking about your company. Do you know what they are saying?

An Ear To The Digital Ground

Knowing people are talking about you but not knowing what they are saying, in our day-to-day lives, can be frustrating. The same situation in the life of our company can be downright dangerous. Learning how to get in on the conversation is the first step to protecting your company, building your brand, and deftly managing its reputation.

Andrew Dennis recently wrote an excellent piece for SearchEngineLand.com about how best to leverage individual brand mentions from a marketing and link-building perspective. In it, he highlighted a handful of great resources that can be used to effectively monitor how your company is being discussed out there on the great big World Wide Web.

Google’s ever-improving algorithms are placing brand mentions and other user-initiated interactions into a more prominent position when it comes to site rankings based on both popularity and authority. It is only fitting that one of the first tools for monitoring your online brand mentions would come from the search engine behemoth itself. Even better, Google Alerts is a free option that sends you emailed updates whenever selected terms find their way onto the Internet.

Certain online marketing SaaS platforms, like HubSpot or Mention, offer monitoring alerts that point you to conversations about your brand as they occur in real time. These providers also present brand mentions in terms of actionable analytics, historical performance and relevance, and outreach opportunities.
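Under the hood, these monitoring tools amount to scanning a stream of posts for your brand’s terms. As a minimal sketch (the sample posts and brand name here are hypothetical, and real tools pull from social APIs, RSS feeds, and web crawls rather than an in-memory list):

```python
# Minimal brand-mention monitor: scan a stream of posts for brand terms.
# The sample posts and the "Acme Widgets" brand are invented for illustration.
import re

def find_mentions(posts, brand_terms):
    """Return posts whose text mentions any brand term (case-insensitive)."""
    pattern = re.compile(
        "|".join(re.escape(term) for term in brand_terms), re.IGNORECASE
    )
    return [post for post in posts if pattern.search(post["text"])]

posts = [
    {"author": "@happyfan", "text": "Acme Widgets saved my weekend project!"},
    {"author": "@grumpy", "text": "Still waiting on my acme widgets order..."},
    {"author": "@unrelated", "text": "Nice weather today."},
]

# Catches both the positive and the negative mention, regardless of casing.
mentions = find_mentions(posts, ["Acme Widgets"])
```

A real pipeline would feed this filter from a social or news API and route the matches to whoever handles outreach, but the core idea is the same keyword scan.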

Now that you know how to find your brand mentions in the online sphere, we can discuss your next steps once you are turned on to a conversation about your brand, both good and bad.

Keep Calm and Communicate On

Now that you have found people discussing your brand, there is no guarantee you will like what they have to say. Negative brand mentions are initially off-putting, but they present a greater opportunity to reclaim a dissatisfied customer than almost any other online scenario. We’ll discuss turning negatives into positives shortly.

In a perfect world, every brand mention you encounter would be a glowing validation of your company, its products, and the service you provide. Nurturing positive brand mentions can go a long way toward personalizing your company to existing and potential customers, driving brand loyalty through the roof. Positive brand mentions can be favorited, liked, retweeted, shared, posted, etc. Taking just a little extra time to reach out to the individual who took the time to mention your company reaffirms how much you value them. It also provides you with a cost-free avenue to gather the opinions and feedback of your most loyal brand advocates.

Additionally, if a positive brand mention is made on a personal/professional blog or within a news story, you should feel absolutely comfortable with reaching out to the author, site, or publication and proposing link sharing with them. Link sharing between reputable sites builds the online authority of both entities.

Understandably, not every mention of your brand is going to be a glowing endorsement of your company. Remaining unaware of, or willfully ignorant of, negative brand mentions can be perilous, even for large, established brands. Not every negative brand mention is a crisis, however. Remember, the individual who is unhappy is (wait for it) a person. Engage in a personable and respectful dialogue with them and you may be surprised at how quickly you can turn that frown upside down. Our human nature drives us not only to want to be heard, but to be heard by the right people. A recovered customer, because of the importance you give to their feelings, can often turn into one of your most vocal and supportive brand advocates.

The Theory of Reputation

Today’s “Always On” society has brand marketers working harder, longer, and faster than they did even 10 years ago. Platforms like Yelp!, Facebook, and Twitter, paired with blogs and online publications, have shown us just how fast a customer service fail or marketing misstep can snowball out of control. Providing yourself with the knowledge and tools to proactively protect your brand reputation just makes sense.

For an online marketing strategy to be effective, today’s business owner has to abandon the “set it and forget it” mentality that led many in the early days to pay thousands of dollars to establish a web presence and then let it exist in a purely static state. Active engagement is required to build your brand with your customers.

If you have practiced other strategies geared toward reputation management, keep the conversation going and share your experience in the comments section below.

About Intellibright

Intellibright was founded on the recognition that a strong online presence is invaluable for driving traffic, leads, and sales for its many partners, both mid-sized and large. Our founder, Ron R. Browning, came from top-tier financial services companies where he was singularly responsible for ad spend budgets in excess of $1M. Intellibright provides and maintains fully optimized web presences based on strict data parameters derived from deep SEO/SEM research, targeted advertising, and regularly updated original content. If you have questions about how Intellibright might help your company with its online marketing needs, call or click today!

(Originally published at: http://blog.intellibright.com/t-talkin-bout-your-reputation )

No more ‘army of one’ with Big Data accountability revolution | #MITIQ

At this week’s MIT CDOIQ Symposium, held in Cambridge, Massachusetts, you could be forgiven for concluding that the two industries that have most embraced the Big Data revolution are healthcare and the financial sector. The benefits of applying analytics models to these fields are more and more apparent. You might be surprised, however, to learn of one particular public sector player that is seeking an operational advantage via Big Data: the Department of Defense.

Mort Anvari, Director of Programs and Strategy within the Deputy Assistant Secretary of the Army’s Cost and Economics division, has overseen the creation and implementation of directives aimed at the cost culture and cost management of the US Army.

“Private industry is easier because everyone is cost conscious,” Anvari explains. “In government, particularly during a war, that mission is driving everything. Most of our officers are only concerned with how much money they need and how they spend it.” Anvari admits that his task of directing commanders to accede to a cost culture mindset was a pretty big deal. “We had to look at it from the people’s perspective. That includes convincing leadership that attention to cost doesn’t create a bad image for the country.” The main argument centered on the perception that being cost conscious could appear to be putting our servicemembers unnecessarily into greater harm’s way.

Surviving the Culture Clash

Anvari claims an early success with the education of commanders and other officers that paying special attention to cost and soldier safety were not mutually exclusive. “You can be cost conscious. You can do more with your resources. You can be more efficient with it and still care about safety and care about the soldiers.”

With projects budgeted above $10 million, a cost/benefit analysis is automatically triggered. Anvari oversees more than 2,000 Army analysts who perform and validate each of these analyses. Knowing the process, commanders will typically prepare an additional course of action beyond the one they submit for review. “Talking to commanders and leadership, we ask ‘What is your information need?’,” he stated. “Based on that, we develop a data need for that organization.”
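The review trigger Anvari describes amounts to a simple budget threshold, with alternative courses of action compared side by side. As a hypothetical sketch (the $10 million threshold comes from the article; the course names and figures are invented for illustration and are not the Army’s actual process):

```python
# Hypothetical sketch of a threshold-triggered cost/benefit review.
# Only the $10M trigger is from the article; everything else is invented.
REVIEW_THRESHOLD = 10_000_000  # projects above this get automatic review

def requires_review(budget):
    """True when a project's budget triggers the automatic analysis."""
    return budget > REVIEW_THRESHOLD

def best_course(courses):
    """Pick the course of action with the highest benefit-to-cost ratio."""
    return max(courses, key=lambda c: c["benefit"] / c["cost"])

courses = [
    {"name": "Course A", "cost": 12_000_000, "benefit": 18_000_000},
    {"name": "Course B", "cost": 11_000_000, "benefit": 19_800_000},
]

flagged = [c for c in courses if requires_review(c["cost"])]  # both flagged
choice = best_course(courses)  # Course B: 1.8 ratio beats Course A's 1.5
```

The point of the sketch is the shape of the workflow: a bright-line trigger, then a comparison of alternatives, which is why commanders prepare more than one course of action.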

Similar to challenges faced in the private sector, one of the issues overcome by the Army was convincing certain organizations that possessed data to share that property with other organizations. “Communicating the data need from organization A to organization B, telling organization B you need to provide this data that is not for you, it’s for someone else,” Anvari said, “that was a big culture shock.”

However, in explaining the funding structure for the military like an upside down tree, Anvari was able to bring an understanding that all funding came from the top and spread out to all of the “branches” within the Army. “We call it fund centers. They have all the money. The cost centers are the ones using this money. It could be them or it could be others. It’s truly like a neural network of information.”

Unlike private sector companies, the Army realized it had to streamline its budgeting and allocation process because it is subject to strict oversight by Congress. As all of the funding is taxpayer money, citizens are also privy to the budgeting process through the use of Freedom of Information Act requests.

As noted above, the cost-benefit analysis process is automatic on projects in excess of $10 million. However, Anvari notes that projects under that threshold, undertaken by commanders of smaller outfits, are pretty well self-monitored because those commanders want to show that they are capable of critically applying the new cost culture analysis in the hopes they will be promoted to heading up larger projects in the future.

“Accountability is hard to swallow,” says Anvari, with regard to the early push back from Army leadership, “no matter how normalized the process is.” Anvari’s work is seeing results, however. “Cost management is on its feet and working,” he concluded.

(Originally published at SiliconANGLE.com)

photo credit: The U.S. Army via photopin cc

You may not need Big Data after all | #MITIQ

The business buzzword over the past two years has been “Big Data”. Companies are trying to figure out how they can leverage their collected data and translate it into a competitive advantage. However, according to the Director of MIT’s Sloan School Center for Information Systems Research, Jeanne Ross, this approach is not necessarily a one-size-fits-all solution for today’s organizations.

Ross, co-author of the article ‘You May Not Need Big Data After All’, cautions businesses against buying into the hype around Big Data.

“I think you grow into Big Data,” Ross notes. She explains that there are companies who find the competitive advantage works within their specific industries. As an example, she notes that the oil and gas industry has long employed Big Data for helping them to decide when and where they should place a billion dollar well. The success in one industry, however, doesn’t necessarily translate into success in others. “Many times we know great things about our customers. We just haven’t figured out a way to address them.”

When asked whether the fear some companies feel, that they cannot address the Big Data they have, is misplaced, Ross states, “No, not misplaced at all. If you don’t think you can do it, you probably can’t.” For organizations recognizing the potential value of Big Data for the first time, this news could be disheartening.

“I don’t think most companies are data-driven,” explains Ross. “I think they are metric driven.”

This differentiation is important. Today’s companies can respond to certain kinds of data but in order to truly be a data-driven organization, they have to recognize which data is important. As an example, Ross cites Foxtel, a pay TV service based out of Australia.

“They saw what products were going out and what channels people wanted,” she states. Even with that information they were unable to make strategic decisions. “They went back and started looking at segments and realized what ‘data driven’ would be. They didn’t have the stomach to go back and do that.”

Where the CDO fits in

Discussing the emerging role of the CDO, Ross explained that too often there is a propensity to assume that once a CDO is brought into an organization all data issues will be addressed by that role and that little to no further attention is required. With Gartner projecting a 25 percent adoption of a CDO role in companies by next year, Ross claims most companies likely don’t need to create this position.

The key to running a successful organization is identifying and maintaining a single source of truth with respect to data. Many divisions within a company will manipulate data to show that they are running at a profit or contributing significantly to the organization’s bottom line. In the long run, this can be detrimental to the company because different data can show different outcomes.

Once companies adopt a single source of truth in their data, Ross believes it is of utmost importance that it is adopted in a top-down strategy. “We have to let people know mistakes have to be made. The faster you make mistakes, the more you can learn and the faster you can grow.” This strategy is ineffectual, however, if you start in the middle of the organization as people will be less willing to admit mistakes and failure if it hasn’t been adopted into the company’s cultural model.

The swiftly moving current of technology, especially over the previous five years, should be viewed critically by companies hoping to somehow gain a competitive advantage. Leveraging Big Data requires more than just a willingness to throw money at the problem. It requires a full understanding on the part of the company as a whole.

(Originally published at SiliconANGLE.com)

photo credit: Free Grunge Textures – www.freestock.ca via photopin cc

Exploring the emerging role of the CDO | #MITIQ

Since 2006, the Massachusetts Institute of Technology has hosted its Chief Data Officer and Information Quality symposium, highlighting the emerging role of data as a significant driver of revenue generation within an organization. The role of the CDO is the most recent iteration in the evolution of the data steward.

Writing for Wired, Bob Leaper of DST Global Solutions detailed that evolution from the early ’80s through to today. The Data Processing Manager gave way to the Chief Information Officer, placing an individual with computational know-how into the boardroom. Even that ascendancy, he claims, still left a company’s data footprint, and how best to utilize it, somewhat in the shadows. The CDO’s role was created as a means of bridging the gap between the IT department and the company’s operations.

SiliconANGLE’s theCUBE, broadcasting now for the 2nd year from #MITIQ, will be detailing the responsibilities for this relatively new role as they slowly solidify. To that end, the live broadcast will feature interviews with academics, executives, thought leaders, experts and bloggers over the next two days.

In theCUBE’s kickoff for the event, co-host Jeff Kelly commented on how the role of the CDO is moving in the direction of being a more accepted position in the enterprise. In light of recent research by Gartner which claimed 25 percent of companies would have appointed a CDO by the end of next year, Kelly believes the role will be critically important for modern companies and that it will maintain that level of importance even a decade on from today.

With recent revelations of how some companies are utilizing their customers’ data, one newly important function of the CDO role will be minimizing risk within the organization. While the goal of any company is to analyze its data in a way that drives revenue, the CDO will take on the added responsibility of ensuring all analysis is performed to the highest ethical standard. This is because any perceived misuse of customer data, and the backlash it creates, will fall squarely on the shoulders of the company’s CDO.

Be sure to join Dave Vellante, Jeff Kelly and Paul Gillin all this week for the live broadcast of theCUBE at siliconangle.tv.

photo credit: Pierre Metivier via photopin cc

(Originally published at SiliconANGLE.com)

Every company is a media producer, says Ustream founder | #IBMimpact

SiliconANGLE’s live broadcasts of theCUBE are facilitated by Ustream.tv. Joining John Furrier and Paul Gillin at this week’s IBM Impact conference at Las Vegas’ The Venetian Resort and Casino was the CEO and founder of Ustream, Brad Hunstable. The conversation covered the importance of media production in the enterprise, Ustream’s capability to safely and effectively transmit media for organizations, and what the future holds for his company.

Ustream is currently the largest HD-capable live streaming option on the market. As Hunstable notes, “We started humbly and now have grown into a large provider for businesses. We built it completely from the ground up.”

Hunstable shared the origin of Ustream, which he created while he was deployed in the military so he would be able to watch his brother’s different band performances.


“I think where the growth in video is happening,” stated Hunstable, “is within the enterprise.” Enterprise customers are looking to Ustream because they require a full solution and Hunstable and his company can offer that to them. “The beauty of the Cloud is that you can, regardless of [company] size, quickly try different solutions and change or adopt,” he noted.

The reason Hunstable sees tremendous opportunity in the enterprise is evident. “Nestle produces more hours of content than all of Hollywood combined,” he said. “They viewed over 1 billion hours of video live.” This includes events like company-wide corporate messaging or division-specific broadcasts, among many other uses of media.

“Enterprise is a tremendous opportunity,” he stated. “We are going to continue to provide robust solutions for these companies to suit their specific needs. We come from a consumer background and want to bring that knowledge to the enterprise,” he continued. “The reality is that every company is a media producer. They are creating content in many different ways and that helps them reach their customers on a more personal level.”

While the opportunities for growth are at the enterprise level, Hunstable points out that his full solution can be utilized by smaller companies as well. “[Our solution] serves the needs of the entire Enterprise. That means small business should get the same consideration as the enterprise.”

The considerations he refers to have to do with a product that offers security and scale. “They want assurance that your product is safe and that their info can’t be compromised,” said Hunstable. “They also want scale. They want reliability.”

Hunstable cites the simple platform for Ustream’s success. “You can try before you buy. You can start off immediately,” he said. “And if you are a larger company with greater needs, you can talk with one of our sales associates and we can help get you started.”

(Originally published at SiliconANGLE.com)

photo credit: katielips via photopin cc

Predictive analytics stepping front & center in the business world | #ibmimpact

This week, SiliconANGLE’s theCUBE broadcast from both the ServiceNow Knowledge 2014 event at the Moscone Center in San Francisco and the IBM Impact Conference held at Las Vegas’ Venetian Resort and Casino. Helming theCUBE desk for IBM Impact were John Furrier and Paul Gillin. On Day 1, they welcomed the CEO and Principal Consultant for Decision Management Solutions, James Taylor.

Taylor’s biography explains that he is a 20-plus year veteran in the field of Decision Management and is regarded as a leading expert in decisioning technologies. He is also the author of Decision Management Systems: A Practical Guide to Using Business Rules and Predictive Analytics.

At the start of the conversation, Furrier noted that we are in an age that really should be considered one of the most dynamic times for IT. He cites a recognition by the top line of business that IT can drive revenue and business growth, rather than serving simply as a means of cost reduction. He attributes this to significant converging trends that are changing the role of IT: increased speed and agility, unlimited compute in the Cloud, and the fact that everything is now instrumentable.

Interjecting on this thought, Taylor stated, “I think what has really changed is the acceptance of analytics. When I wrote the book, people were uncertain about it.” He continued, “There were ways to use small data that weren’t predictive analytics. If you don’t process it and turn it into a usable prediction, it’s hard to consume it.” As the landscape has changed, he believes predictive analytics is stepping to the front and center in the business world.


Much of that change is being driven by the habits and expectations of the emergence of a more technologically savvy consumer base. Talking about his own 25-year-old son having to purchase auto insurance, Taylor said, “When he gives you his data, he expects a quote. If you say you’ll let him know what the quote is, he’ll just go somewhere else. He wants the answer now.” This expectation requires real time responses even when the proposition is relatively complex and involves assessments of risk. “You have to use analytics to determine how risky he is and you have to answer now.” Taylor claims that capability has to be embedded not only into call center scripts but also into a company’s mobile applications and website where no human-to-human interaction ever occurs. “Because if you don’t, you’re missing the point,” he stated.

Understanding the Principles of Decision Management

Much of the rest of the conversation centered on the Decision Management principles Taylor outlined in his book. The first such principle, ‘Begin With The Decision in Mind’, was recounted by Gillin. “That struck me as kind of obvious,” he said. “Isn’t that how you would go about this? Obviously there is a reason why you said that. Do you find that people typically don’t?” he asked.

“The reason for misquoting the late Stephen Covey there is really twofold,” Taylor began. “The first is that when you look at Decision Support Systems, and there’s obviously a long history there, people are often very unclear what decision it is, in fact, somebody’s going to make with the data.” He goes on to state that all sorts of data is put in without any thought to the eventual decision that will need to be made. Companies, he claims, expect that the employee, whom they regard as smart and experienced, will be able to use the data to arrive at a decision. “What happens when I try to make a decision about what offer to make to a customer who’s on the phone to the call center right now?” he posited. “[The employee] has seven seconds. And they were hired yesterday. And they got three hours of training. They’re not in a position to know what decision they should be making.” He continued, “So, if you don’t know what decision you’re trying to embed the analytics into, you can’t do a good job with the analytics. I put [that principle] in because there was this sense that people were very lackadaisical about what the decisions were they were supporting with their Decision Support Systems.”

Another of Taylor’s principles outlined in his book is to ‘Be Predictive and Not Reactive.’ Gillin stated, “I think you’re right. Using data to support decisions reactively is more intuitive.” He then asked, “What is the mind shift that is involved in moving toward predictive analytics?”

“It turns out to be one of those things where it’s very easy to build things that are predictive,” Taylor stated. “But if they don’t change people’s decision-making behavior, they don’t help. Talk to anyone who does predictive analytics and they’ll tell you stories of building highly predictive models that didn’t change the business,” he explained. “You have to be clear how it’s going to affect the decision you’re going to make before you can build the predictive [model].”

One reason Taylor believes there has been resistance thus far is because the whole notion of predictive analytics shifts the operation of business from the realm of absolutes to the realm of probabilities. “If I’m measuring last month’s results,” he said, “I can give you an absolute number. I can tell you exactly what you sold last month.” Projecting demand for the future doesn’t give you that same certainty. “I can give you a probability or a range,” he stated. “You have to start dealing with a little bit of uncertainty. That’s why it’s important to wrap some rules around these predictions.” However, Taylor says we, as humans, operate in probabilities and chance subconsciously every day. Once that is understood, the mind shift to predictive analytics becomes easier to adopt.
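Taylor’s point about wrapping rules around probabilistic predictions can be sketched as a risk score feeding a small rule set. This is a minimal illustration, not his actual system; the stand-in risk model, thresholds, and premium figures are all invented:

```python
# Hypothetical sketch: business rules wrapped around a probabilistic score.
# The "model" is a fixed lookup standing in for a real predictive model;
# the thresholds and premiums are invented for illustration.

def predict_risk(applicant):
    """Stand-in for a predictive model returning P(claim within a year)."""
    base = 0.05
    if applicant["age"] < 25:
        base += 0.10
    base += 0.15 * applicant["prior_claims"]
    return min(base, 0.99)

def decide(applicant):
    """Rules around the probability turn an uncertain score into an answer now."""
    risk = predict_risk(applicant)
    if risk < 0.10:
        return ("auto-quote", round(500 * (1 + risk), 2))
    if risk < 0.30:
        return ("quote-with-surcharge", round(500 * (1 + 2 * risk), 2))
    return ("refer-to-underwriter", None)

# A young applicant with a clean record gets an instant, surcharged quote.
decision, premium = decide({"age": 24, "prior_claims": 0})
```

The rules are what make the probability actionable: the caller gets an immediate quote or referral rather than a raw number, which is exactly the “answer now” expectation Taylor describes.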

Embracing the Opportunities of Unstructured Data

One key to building a robust predictive analytics structure is incorporating multiple data streams, including unstructured data. “There are a slew of startups getting funding around Business Intelligence and data warehousing,” Furrier pointed out. “That market is shifting. But how does loose data affect some of the opportunities?” he asked.

Taylor pointed out that it is a strength to be able to store data before you’ve figured out how you might use it. “That’s a key advantage,” he said. “I did some surveys recently on predictive analytics in the Cloud,” he explained. “We asked about some of these Big Data sources and what we found was that people who are getting value from Big Data and these unusual sources were people who had some experience with more advanced kinds of analytics.” He further explained that the combination of traditional data with data from e-mails, texts and other unstructured data could only serve to produce a more fine-grained model, improving accuracy. “Its ability to refine existing predictions is really strong. Right now, that’s the biggest use case I see in customers,” he stated.

For more on this interview and others broadcast by SiliconANGLE’s theCUBE from both of this week’s events, be certain to visit SiliconANGLE’s YouTube Channel.

(Originally published at SiliconANGLE.com)

photo credit: cali.org via photopin cc

Millionaire Who Beat His Girlfriend On Video Receives No Jail Time, Just A $500 Fine

What would you buy if you had $500 burning a hole in your pocket? Maybe a Sony 40” Smart HDTV? How about a really nice dinner for you and three of your friends? Or you could take the family on a staycation to your nearest Six Flags theme park. Whatever you decide, there are near limitless options for how you might spend half a grand.

Gurbaksh “G” Chahal found an especially unique way to spend his $500. His money bought him a 30-minute all-out assault on his now ex-girlfriend. For a full half-hour, Chahal punched, kicked, dragged and choked this woman, taking time out occasionally to inform her that he was going to kill her. According to reports, in that 30-minute time period, Chahal struck his then-girlfriend 117 times. That number seems oddly specific. Perhaps it is because the entire episode was caught on video by his own home surveillance system. Upon seeing the video, the arresting officers found cause to charge Chahal with 45 felony counts.

It should be noted, as recent cases of affluenza have highlighted how our legal system seems to have become even more perverted in its exercise of justice, that Chahal is a northern California tech guru and millionaire who heads the advertising platform RadiumOne. So, yes, if you are a person of means, you can now assault people on camera and walk away with the slap on the wrist that is probation and a pitifully small cash fine.

The events detailed above occurred in August of 2013. As the trial got underway, the prosecution ran into their first brick wall: the victim changed her mind and opted not to testify against Chahal. The state’s case was weakened further when the judge ruled the video recording of the attack inadmissible. The police removed the recording from Chahal’s residence on the night of the beating, fearing if they left it, Chahal would most certainly destroy the evidence. Still, the judge deemed that removal, performed without a warrant, meant the prosecution could not use it to convict Chahal.

I think you’d be hard pressed to find anyone who, having seen the possibility of a lengthy prison term wiped away in favor of probation and a small fine, wouldn’t slink away quietly and reflect on how their previous actions might not reflect who they actually want to be. Not Chahal, however. Despite having pled guilty to two misdemeanor counts (domestic violence and battery), Chahal launched a Twitter offensive yesterday going after all his haters.

Is the internet this stupid to read one side of the story by tabloids vs. the actual truth? Grow up people before u judge false allegations.

— Gurbaksh Chahal (@gchahal) April 25, 2014

For the last 10 months there were overblown allegations against me because of my alleged high-profile status.

— Gurbaksh Chahal (@gchahal) April 25, 2014

I got cornered to accept a misdemeanor plea with a $500 fine to resolve the matter and move on with my life .

— Gurbaksh Chahal (@gchahal) April 25, 2014

Rather than continuing a political witch hunt for another year attempting to fully clear my name.

— Gurbaksh Chahal (@gchahal) April 25, 2014

I maintain my innocence regarding these exaggerated allegations.

— Gurbaksh Chahal (@gchahal) April 25, 2014

Chahal and his behavior are not unknown in the tech world. Since the furor over his conviction and fine, there has been a steadily growing chorus calling for his resignation as CEO of RadiumOne. Chahal, defying his detractors, insists he is going nowhere. However, if a certain venture capitalist has his way, Chahal may not be the one making that decision. Jason Calacanis took to Twitter to encourage the public shaming of Chahal for a cause. He has offered to make a donation to a domestic violence charity in the amount of $10,000 should anyone leak the footage of the attack. Should that happen, Chahal will certainly learn that the Internet citizenry is far less forgiving than the legal system that let him off with a warning.

Though viral attention to this story is barely 24 hours old, there are many who, only able to view the world through a liberal v. conservative prism, have tried to make hay over the fact that Chahal was a fairly prominent fiscal supporter of the Democratic Party. While his political leanings are in no way a predictor of domestic violence, it should be noted that the Democratic National Committee has returned Chahal’s $20,000 contribution for 2014.

photo credit: alles-schlumpf via photopin cc

(Originally published at addictinginfo.org)

Big Data is coming alive, sports leading the way | #SportsDataSV

Big Data is coming alive in a very real way in Northern California. SiliconANGLE’s John Furrier and Wikibon’s Jeff Kelly sat down with top figures of the professional sports teams that call the Silicon Valley area home and discussed how their teams have embraced emerging technologies aimed at achieving successful front office operations, providing enhanced player and draft statistics on the field and improving the overall fan experience.

Tech-Targeting Future Fans


The idea that Big Data will be used in that third avenue is going to be one of the drivers of future, wider adoption by the Enterprise in other industries. The near-seamless integration of the technology in stadiums, aimed at the fan, will mirror the rapid and widespread acceptance of mobile technology over the past decade. Once the general public is able to get its head around what is now an amorphous concept, via the sports arena, there will be increased demand for the technology at the customer level in other industries. And that can only be achieved, as was voiced in most of the interviews, when an organization works top to bottom to use the new technology to both streamline and add depth to its entire operation.

During the interview featuring Dave Koval, President of the MLS San Jose Earthquakes, the most interesting takeaway was the realization that, though each interview dealt with a major league sport, there is no one-size-fits-all approach to implementation. Koval cited specific technology requirements based on the team’s fan demographic. Speaking to the organization’s philosophy on tech, Koval said, “I see it as an enabler. It helps create a better experience. One-third of our fans are millennials who only interact via their mobile. If you don’t support that and their needs, they turn off.”

But perhaps the most interesting innovation on the fan front is how the Earthquakes have endeavored to communicate with current and potential fans. “We have a Business Information unit with data scientists taking all of the information they can from soccer fans and bringing them together around the Quakes,” Koval stated. “We even look at the [style of] language they use online and try to ascertain their personality to tailor our communications to them. It allows us to be able to market to them one-on-one.”

Focusing On The Fan Experience


If football is more your cup of tea, you’ll want to go to the SiliconANGLE channel on YouTube to watch the interview with Doug Garland, General Manager of Stadium Experience and Technology for the San Francisco 49ers. The team appears to have taken a page from NASCAR with regard to improving overall fan engagement, and its future initiatives are, no doubt, going to revolutionize the fan experience in a big way.

Garland provided an interesting equation during the conversation. Citing the 160-foot-wide Mitsubishi HD screen installed at AT&T Stadium in Arlington, Texas, he stated that no such centerpiece would be going into the new Levi’s Stadium. The stadium, with a capacity of 70,000, is located in a region of the country where users replace their mobile devices every year and a half at a price of about $1,000. Rather than spending $70 million as a team on a large centerpiece screen, the 49ers opted to build a network, including WiFi, that would leverage the screens nearly every fan is already carrying.
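Garland’s equation is simple back-of-the-envelope arithmetic: a full house of fans collectively carries about as much display hardware as the cost of a giant centerpiece screen. A quick sketch of the numbers from the passage above (the figures are the ones Garland cites, rounded):

```python
# Back-of-the-envelope version of Garland's equation: the aggregate value of
# the screens fans already carry rivals the cost of a giant centerpiece display.
capacity = 70_000      # seats at the new stadium
device_price = 1_000   # approximate dollars per fan's device, replaced ~every 18 months

fan_screen_value = capacity * device_price
print(f"${fan_screen_value:,}")  # prints "$70,000,000"
```

In other words, building a robust network to reach 70,000 pocket-sized screens taps roughly the same $70 million worth of display hardware without the team buying any of it.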

Watch the interviews in their entirety here:

At the new Levi’s Stadium, the fan experience via mobile device will begin before driving onto the property. From directing the fan to the stadium and to an available parking area through real-time data to letting them know which bathroom has the shortest lines, the fan’s handheld device will be key to providing an improved experience.

“What else are we going to do? One of the other areas that we know is a big hassle for fans at the game is waiting in long lines at concession stands. If you go to a football game and you get hungry, you are trying to figure out which part of which quarter do I have to miss so I can go get that hot dog,” Garland explained. “That’s a trade-off we don’t want our fans to have to make. So what we are going to enable at Levi Stadium is the ability for any fan sitting in any section of the stadium to order food and beverage to their seat using their mobile device.”

But the football fan experience would be nothing without the beloved replay. The San Francisco 49ers have the “competition on the couch” firmly in their crosshairs with a value-add they are perfecting right now for the new stadium: replays streamed to every app-enabled mobile device in the stadium within five seconds of every play. As Garland notes, this will apply to every play, even those with controversial calls and plays currently under review.

After Moneyball


An analysis of sport and Big Data that doesn’t bring the data-heavy game of baseball into the mix would be an incomplete analysis. The importance of data to the game was highlighted by the 2011 film Moneyball. When asked about the tension portrayed in the film between the old-school scouts and the data geeks, Bill Schlough, Senior VP and CIO of the San Francisco Giants, said, “I think that movie was closer to reality than a lot of people think. What’s interesting about old school scouts is that they rely on data as much as the new guys. They don’t always realize they are doing it though. They look at biomechanics.” He continued, “We are now quantifying what was once unquantifiable. That information was in the heads of the scouts. Now we have computational data models.”

While the 49ers and the Earthquakes are able to embrace and implement emerging technology from scratch in newly constructed stadiums, the Giants play in a 14-year-old facility, albeit one nationally recognized for its design and vistas. Upgrading an existing stadium presents its own challenges. Schlough, echoing a sentiment from every other guest of the evening, stressed the importance of providing a robust WiFi and 4G LTE environment for the fans. Implementing wireless technology requires a great deal of behind-the-scenes wiring and upgrades; however, if fans can’t use a technology that is ubiquitous in their day-to-day lives at a sports facility, the overall fan experience is significantly diminished.

The Data Driven Front Office


Focusing on Big Data in the front office, John Tortora, COO of the San Jose Sharks, discussed several key areas in which operations have been improved by the embrace of new technology. It should be noted that the principal owner of the Sharks is Hasso Plattner, co-founder of SAP, so it should come as no surprise that the business operations and customer service software is SAP-branded. The team uses the software for tailored employee evaluation and more efficient inter-office collaboration.

Tortora also pointed out that the NHL has recognized the importance of Big Data at the league office: via weekly and bi-weekly analytics reports, individual teams are able to evaluate their own ticket and box office sales and compare their volume and price points against every other team in the league.

We are standing on the precipice of a bold new perspective on the conduct of business. Recognizing the value both for their own organizations and for the individual fan, sports organizations like the NHL, MLB, NASCAR, NFL and MLS are bringing their fans in for a soft landing, easing them toward an understanding and acceptance of concepts that, for now, remain murky to many in the general public and even to some in the business world. The excitement brimming just under the surface is poised to revolutionize almost every aspect of our daily lives, and, for now, it appears sport is leading the way.

(Originally published at SiliconANGLE.com)

Researchers Develop Novel Compound From Fart Gas Which May Provide Real Health Benefits

A few years back, Kelly Clarkson, the first winner of television’s biggest talent show, American Idol, reminded us of the adage “what doesn’t kill you makes you stronger” in her 2011 hit single ‘Stronger’. The unfortunate flaw in that logic is that sometimes what you come up against will actually kill you. Today, we learn about researchers from the University of Exeter Medical School who have studied a potentially lethal substance for the remarkable health benefits it can provide.

Hydrogen sulfide, perhaps infamously known for its trademark odor of rotten eggs or a particularly foul brand of flatulence, is a heavier-than-air gas that, in the right concentrations, can asphyxiate those unlucky enough to find themselves in its noxious presence. To be certain, being trapped on an elevator with Earl from accounting while he wages an intestinal battle with last night’s pepperoni pizza will not be pleasant, but neither will it be a fatal ascendancy into the hereafter.

Twenty-five years ago this month, however, a dairy-farming family tragically learned of the lethality of hydrogen sulfide when, at approximately 9 a.m., five male family members were overcome by and succumbed to the oxygen-robbing chemical compound. The incident occurred in the farm’s manure pit, an enclosed structure just off the cattle barn into which cattle waste was conveyed by an electric system.

The tragedy began when the farmer’s 28-year-old son entered the pit to replace the shear pin on the agitator shaft. Accompanying him was his 15-year-old nephew who, upon seeing his uncle overcome, yelled to his younger brother outside the pit to go and get help. When help returned, the 15-year-old had also become unresponsive in the pit. While emergency services were en route, the farmer, his other son and the farmer’s nephew all entered the pit to attempt an extraction. In less than 20 minutes, the farmer, his two sons, his grandson and his nephew all lay motionless inside the putrid environment.

Why then would researchers look at this highly-toxic compound with the offensive odor for some form of health benefit? It may be because, as discussed above, what doesn’t kill you can make you stronger. The team found that hydrogen sulfide, in just the right tiny dosage, can help to fend off life-altering conditions, from diabetes to stroke, heart attacks and dementia. They believe the right dosage, which they designed and produced in a novel compound, could be instrumental in developing future therapies.

Early work with their new compound has shown that it actively protects the powerhouse of the cell – the mitochondria. Mitochondria – which regulate inflammation and determine whether a cell lives or dies – drive energy production in blood vessel cells. If the compound can prevent or even reverse mitochondrial damage in these cells, the team believes their creation will be sought as a primary treatment for patients suffering from heart failure, stroke and diabetes, as well as for those diagnosed with arthritis and dementia. Dysfunctional mitochondria have already been linked to the severity of a number of diseases.

“When cells become stressed by disease,” noted professor Matt Whiteman of the University of Exeter Medical School, “they draw in enzymes to generate minute quantities of hydrogen sulfide. This keeps the mitochondria ticking over and allows cells to live.” Continuing, he stated, “If this doesn’t happen, the cells die and lose the ability to regulate survival and control inflammation. We have exploited this natural process by making a compound, called AP39, which slowly delivers very small amounts of this gas specifically to the mitochondria. Our results indicate that if stressed cells are treated with AP39, mitochondria are protected and cells stay alive.”

Echoing Whiteman’s sentiment, Dr. Mark Wood of Biosciences at the university stated, “Although hydrogen sulfide is well known as a pungent, foul-smelling gas in rotten eggs and flatulence, it is naturally produced in the body and could, in fact, be a healthcare hero with significant implications for future therapies for a variety of diseases.”

While the research has yet to progress to human trials, professors Whiteman and Wood are hopeful their early results will hasten that process. So far, after looking at several models of disease, the pre-clinical results are promising. Their work on cardiovascular disease shows that more than 80 percent of mitochondria are able to survive in otherwise hostile and highly destructive conditions when the AP39 compound is administered.

Additionally, small-scale studies presented last month at the 3rd International Conference on Hydrogen Sulfide in Biology and Medicine in Kyoto, Japan showed benefits of the compound in treating high blood pressure. AP39 was able to reverse blood vessel stiffening, which aided in lowering blood pressure. The compound has also proven effective after heart attack, helping to slow the contractions of the heart and improve its efficiency.

The University of Exeter study appears in the journal Medicinal Chemistry Communications. Researchers from the University of Texas Medical Branch have published their own work with the Exeter compound, which shows AP39 selectively protects mitochondrial DNA. Protecting that DNA is crucial because, once damaged, it cannot be repaired, leaving the individual far more vulnerable to disease symptoms. The University of Texas follow-up study was published in the journal Nitric Oxide.

Hydrogen sulfide, which has previously proven its unpleasant and even lethal qualities, could in fact be one of the more important cure-alls to come along in some time. Its broad applications in cardiovascular health and the regulation of inflammation, along with the pre-clinical success already enjoyed by the AP39 compound, could well improve health outcomes for those afflicted with anything from high blood pressure to dementia. So, it would appear, in this one instance at least, Ms. Clarkson certainly knew of what she sang.

photo credit: ficknoster via photopin cc
(Originally published at redOrbit.com)

Making Medical Data Better Data

In just over a week, the team here at SiliconANGLE will be setting off for Cambridge, Massachusetts for the 7th Annual MIT Chief Data Officer and Information Quality Symposium, held on the MIT campus. The theme this year is ‘Big Data Demands Good Data’. In preparation for the event, we will be presenting a synopsis of some of the important topics scheduled to be covered, in conjunction with related academic and real-world applications.

One of the first sessions, to be presented by Dr. David Levine, Vice President of Informatics/Medical Director of Comparative Data and Informatics for United Healthcare, will address the need for improving risk models based on predictive analytics.  According to the abstract of Dr. Levine’s presentation, he will discuss how the advent of and improvements in data collection can be utilized to drive improved performance in health administration and patient care.

Writing recently for FierceHealthIT, Dan Bowman discussed how a recent survey of hospital CIOs found big data in the medical field severely lacking in efficacy. Bowman cites a healthsystemCIO.com survey in which 76 percent of respondents claimed their vendors were too often over-promising and under-delivering with their big data solutions.

In the same survey, a full 52 percent of respondents admitted to not using their big data applications at anything approaching a sophisticated level. That figure is reinforced by the two-thirds of respondents who claimed their organization had neither the manpower nor the skill to take advantage of analytical tools at a high level. These factors led one CIO to state that the big data market, as it pertains to healthcare, isn’t likely to reach maturity for at least a few more years.

The challenge, it appears, is for the healthcare field to better understand the mountains of data they collect and learn how to better pore over it to help in establishing better predictive models for patient care and hospital administration. According to Chris Belmont, CIO at New Orleans’ Ochsner Health System, “We have the data points. We just have to do a better job of getting our hands around the data and understanding it better.”

What is the real-world benefit of achieving an optimal understanding and usage of data in the medical field, though? Jeff Kelly of Wikibon.org discussed the necessity of improving the collection and analysis of big data for the fields represented in the Industrial Internet in a posting entitled ‘The Industrial Internet and Big Data Analytics: Opportunities and Challenges’. Shockingly, he stated that as much as 43 percent of an estimated $2.75 trillion in healthcare spending (for 2012 alone) went to unnecessary procedures and administrative waste. That abhorrent figure could be significantly reduced as the Industrial Internet is increasingly utilized by hospital CIOs to target administrative inefficiencies, eliminate waste and improve patient outcomes.
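To put Kelly’s percentage in dollar terms, a quick back-of-the-envelope calculation on the figures cited above (43 percent of an estimated $2.75 trillion) shows the sheer scale of the waste:

```python
# Rough scale of the waste figure cited above: 43 percent of an estimated
# $2.75 trillion in 2012 U.S. healthcare spending.
total_spending = 2.75e12   # dollars, 2012 estimate
waste_share = 0.43         # portion attributed to unnecessary procedures and admin waste

wasted = total_spending * waste_share
print(f"${wasted / 1e12:.2f} trillion")  # prints "$1.18 trillion"
```

That works out to roughly $1.18 trillion in a single year, which is why even modest gains in targeting inefficiencies would be worth an enormous amount.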

Dr. Levine’s presentation seems to aim at addressing the better utilization of data for improved patient outcomes. His work with United Healthcare, in tandem with their membership organizations, is striving to make predictive models more meaningful and actionable with the aim toward driving improved performance.

photo credit: Funky64 (www.lucarossato.com) via photopin cc

(Originally published at SiliconANGLE.com)