Archive for the ‘Analytics’ Category


http://www.toptechnews.com/story.xhtml?story_id=131008O5BA6H&page=2


In-memory computing is making its way out of R&D labs and into the enterprise, enabling real-time processing and intelligence

The massive explosion in data volumes collected by many organisations has brought with it an accompanying headache: putting that data to gainful use.

Businesses increasingly need to make quick decisions, and pressure is mounting on IT departments to provide solutions that deliver quality data much faster than has been possible before. The days of trapping information in a data warehouse for retrospective analysis are fading in favour of event-driven systems that can provide data and enable decisions in real time.

Indeed, real-time computing is a new catch cry across the technology industry. A hypothetical example is a retailer that can monitor a customer’s real-time behaviour in store or on a website and draw on historical data from their loyalty system regarding spending patterns to make them offers that they might respond to in that moment.

Such a scenario has long been dreamed of, but it is being made possible today for retailers and other industries thanks in part to a technology known as in-memory computing.

In-memory computing works by bringing data physically closer to the central processing unit.

Chip manufacturers have been on this path for some time with the integration of Level 1 and Level 2 caching into microprocessors, as moving indices into Level 1 cache makes them more quickly accessible. Moving out through the caching levels usually results in a loss of speed but an increase in the size of data storage.

In-memory technology follows the same principles and moves data off disk and into main memory, eliminating the need to run a disk-seek operation each time a data look-up is performed and significantly boosting performance.
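The principle can be illustrated with a minimal Python sketch (purely illustrative, not any vendor's implementation): the same table is built once in a disk-backed SQLite file and once in SQLite's in-memory mode, and random point look-ups are timed against each. The table, row count and file path are arbitrary assumptions, and operating-system caching flatters the disk case on a toy this small; the gap is far larger at real data volumes.

```python
import os
import random
import sqlite3
import tempfile
import time

ROWS = 200_000
LOOKUPS = 5_000

def build(conn):
    """Create a simple keyed table and fill it with sample rows."""
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?)",
        ((i, random.random() * 100) for i in range(ROWS)),
    )
    conn.commit()

def time_lookups(conn):
    """Time random point look-ups against the given connection."""
    start = time.perf_counter()
    for _ in range(LOOKUPS):
        key = random.randrange(ROWS)
        conn.execute("SELECT amount FROM orders WHERE id = ?", (key,)).fetchone()
    return time.perf_counter() - start

disk_path = os.path.join(tempfile.mkdtemp(), "orders_demo.db")
disk = sqlite3.connect(disk_path)      # table pages live on disk
memory = sqlite3.connect(":memory:")   # table pages live entirely in RAM

for conn in (disk, memory):
    build(conn)

print("disk-backed look-ups (s):", round(time_lookups(disk), 3))
print("in-memory look-ups   (s):", round(time_lookups(memory), 3))
```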

The idea of running databases in memory is nothing new, and was one of the foundations of the business intelligence product QlikView, released by QlikTech way back in 1997. More recently other technology companies have jumped on the bandwagon, notably SAP and TIBCO. What is making in-memory so popular now is that plunging memory prices have made it economical for a wider range of applications.

Gartner’s managing vice-president for business intelligence and data management, Ian Bertram, says the key application for in-memory technology today remains business intelligence, where it enables analysis to be conducted on the fly, with accompanying faster data refreshes.

“It’s creating a data warehouse in-memory, which is a much faster technology than doing stuff on disk,” Bertram says. “Disk clearly has a much greater capacity, but it is slower. In memory, it is instantaneous. The value comes in for people who have to make decisions really quickly, as they don’t have to wait 12 hours for an OLAP [online analytical processing] cube to be built anymore.”

As a pioneer of in-memory technology, QlikView is used by a range of Australian companies for business intelligence, including the packaging company Amcor, which is using it to make better decisions on the profitability of its delivery operations.

QlikTech’s senior vice president of products, Anthony Deighton, claims the in-memory architecture has enabled his company to build a product that focuses on ease of use.

“When users interact with our product, they can click and find their own path through the data, and do that interactively and at high speed,” he says.

The marketplace for in-memory products is rapidly becoming crowded. SAP co-founder, Hasso Plattner, has been driving the development of in-memory computing through the eponymous Institute that he founded at the University of Potsdam in 1998. Its efforts were first unveiled at SAP’s annual SAPPHIRE conference in 2009. This year, SAP announced the High-performance Analytical Appliance (HANA) project as a roadmap for in-memory computing.

About 20 companies were invited to join the trial, but more than 40 have now done so, and SAP will be selling in-memory appliances by the end of 2010.

SAP Australia and New Zealand products and solutions group director, John Goldrick, says that in-memory look-up times are further reduced by storing data in a column format (as is the case with SAP’s Sybase IQ database) rather than in rows, which means that for many operations only the contents of one column need to be read, not the entire table.

As each column comprises records of the same data type and size, the database can be efficiently compressed. According to Goldrick, about 40 per cent of data storage is no longer required, along with about 80 per cent of data look-up activity. In one instance, he says, a 1.8TB database was compressed down to just 70GB.

“All of a sudden you are moving down to being able to use raw data faster than you could ever use the aggregated data, so the whole processing time becomes faster,” Goldrick says. “We did a study and worked out that we could hold the entire system of all but four of our largest customers on one blade, and get the processing speed to be 10,000 times faster than it currently is from disk.”
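A toy Python sketch shows why the column layout described above compresses so well; this is an illustration only, not Sybase IQ's actual encoding, and the table shape and the choice of zlib as the compressor are assumptions. Once values of the same type, with heavy repetition, sit next to each other, a general-purpose compressor finds far more redundancy than it does in the equivalent row-by-row serialisation.

```python
import random
import zlib

# Toy fact table: 100,000 rows of (customer_id, country, amount).
countries = ["AU", "NZ", "DE", "US", "JP"]
rows = [
    (i, random.choice(countries), round(random.random() * 1000, 2))
    for i in range(100_000)
]

# Row layout: each record serialised in full, one after another.
row_layout = "\n".join(f"{i},{c},{a}" for i, c, a in rows).encode()

# Column layout: each column stored contiguously, so values of the same
# type (and often the same value) sit next to each other.
column_layout = b"|".join(
    ",".join(str(v) for v in column).encode() for column in zip(*rows)
)

print("row layout, compressed bytes:   ", len(zlib.compress(row_layout)))
print("column layout, compressed bytes:", len(zlib.compress(column_layout)))
```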

Most in-memory BI systems today draw data from an existing source such as a data warehouse. Reporting tools can then be pointed at the in-memory database to generate reports.
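As a rough sketch of that flow, assuming nothing about any particular BI product: a small on-disk SQLite “warehouse” (the file name, table and figures are hypothetical) is copied wholesale into an in-memory database, and the report query is then run against the in-memory copy rather than the disk file.

```python
import sqlite3

# Hypothetical on-disk "warehouse" (file name and data are illustrative only).
warehouse = sqlite3.connect("warehouse_demo.db")
warehouse.execute(
    "CREATE TABLE IF NOT EXISTS sales (region TEXT, product TEXT, revenue REAL)"
)
warehouse.execute("DELETE FROM sales")  # keep the demo repeatable
warehouse.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [
        ("East", "widgets", 120.0),
        ("West", "widgets", 95.5),
        ("East", "gadgets", 210.0),
    ],
)
warehouse.commit()

# Step 1: copy the warehouse tables into an in-memory database.
in_memory = sqlite3.connect(":memory:")
warehouse.backup(in_memory)

# Step 2: point the "reporting tool" at the in-memory copy.
report = in_memory.execute(
    "SELECT region, SUM(revenue) AS total FROM sales GROUP BY region"
).fetchall()
print(report)   # e.g. [('East', 330.0), ('West', 95.5)]
```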

 

Transactional reporting

 

The next step is to move transactional systems in-memory, where it is possible to update and analyse transaction information in real time. In June, TIBCO released a series of new in-memory products for high-speed data management, the ActiveSpaces Suite, which provides fast shared memory that lets distributed applications exchange and process real-time data more quickly.

Chairman and chief executive, Vivek Ranadivé, says it has immediate application in fields where there are high numbers of transactions, such as airlines, banking or utility smart grids.

“You can analyse things as they happen in your transaction system in real time,” Ranadivé says. “There is no extracting and translating the data into your data warehouse to run your analytics.”
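A hedged sketch of the idea, in generic Python rather than TIBCO's actual ActiveSpaces API (the class, threshold and account names are invented for illustration): running aggregates are held in memory, and every incoming transaction both updates the aggregate and is analysed on the spot, with no extract-and-load step into a separate warehouse.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Transaction:
    account: str
    amount: float

class InMemoryAnalytics:
    """Maintain running aggregates in memory and react as each event arrives."""

    def __init__(self, alert_threshold: float):
        self.totals = defaultdict(float)       # per-account running totals
        self.alert_threshold = alert_threshold

    def on_transaction(self, tx: Transaction) -> None:
        # The aggregate is updated the moment the event happens; analysis runs
        # against live in-memory state, not a warehouse loaded overnight.
        self.totals[tx.account] += tx.amount
        if self.totals[tx.account] > self.alert_threshold:
            print(f"ALERT: {tx.account} has exceeded {self.alert_threshold}")

engine = InMemoryAnalytics(alert_threshold=500.0)
for tx in (
    Transaction("acct-1", 300.0),
    Transaction("acct-1", 250.0),   # this event trips the alert in real time
    Transaction("acct-2", 90.0),
):
    engine.on_transaction(tx)
```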

His vision is to enable corporations to evolve from being transaction-driven businesses to becoming event-driven ones, where events within the organisation can trigger actions in real time based on existing data; a process he describes as the ‘two-second advantage’.

“The two-second advantage is about having a little bit of the right information in the right context just a little beforehand — whether it is two seconds, two hours or even two days,” says Ranadivé. “By sensing what is happening around them, businesses can constantly adjust and react a little ahead of the competition or in anticipation of a customer’s wants and needs.”

Underlying this is in-memory architecture, which he says offers fast, shared memory to accelerate the exchange and processing of real-time data and events. Hence, in-memory technology is generally spoken of in the same breath as the broader movement towards real-time information processing systems.

 


The world of analytics has changed: top organisations can now harness the power of multiple views and opinions from within. Analytics has traditionally been a one-to-one relationship: someone requests a piece of analytics, an analyst creates and prepares the work, and it is then presented back for consideration and action.

Now, with the deployment of social media, analytics can be created and then commented on by multiple analysts. The multiple “eyes” or “views” give greater insight than a single individual with their own skills and bias. Managers can see various aspects of the analytics from varying points of view. It may be that everyone sees the same thing and the recommendations match, but it may also be that varying aspects are revealed that were not previously considered. This was the approach I took at the World Bank to ensure we harnessed all views from many sources. It was amazing how many different views came forward from the same data when applied across many geographies and departments. This is the new world of analytics, and it can be harnessed if done the right way.

I saw the value in this approach and am now showing other organisations how to harness this amazing power. This approach allows “cognitive variance” to be applied, giving a richness of understanding and harnessing corporate knowledge. It can also be used between organisations in an industry. It is a little like spreads theory, where you harness the power of many individuals’ thoughts to reveal a bias or trend.

Analytics is evolving, and the next step is collaborative analytics.

Get in early, and gain the first-to-market advantage.

Paul Ormonde-James

 


A great article on the power of data to make you look good, even if you are really “cheating”. As they state, anyone can be a true visionary!

Why should the smart people have all the fun with year-end predictions? You can issue your own! At this time of year, even hopeless nitwits can seem smart.

Once you set up a blog — any free service will do — all you have to do is throw together your trends. Keep these easy-to-use techniques in mind.

• Re-use last year’s trends. Does anyone really believe that 2010’s trends sat down in December for a cosmo and never stood up again? You can safely predict that this year’s trends will be next year’s, too.

• Search in Google for your industry’s name and “trends.” Take notes, rewrite a little bit and, boom, you’re an expert.

• Water the evergreens. For 2009, someone predicted, “Data interpretation will become a significant challenge for new BI users.” Will become? Can you imagine fewer business people having trouble interpreting data no matter what year it is?

• Follow in the draft of top vendors. Competition cyclists know that the easiest place to ride is just inches behind another rider. See where Oracle and IBM say they’re going and point in that direction. If a gang of marketing departments pushes an idea, it’s guaranteed to find at least a few new customers.

• Quantifying is risky but, done cleverly, it adds credibility. Just make sure your numbers can’t be verified. One clever expert sees 15 chiefs of analytics being hired in 2011. Bingo! The mere presence of a number, any number, gives the feel of certainty. Even if someone wanted to count, how would they do it?

• It’s good to be vague, but better to be incomprehensible. Suppose your crystal ball shows video becoming a big deal in 2011 (as if it weren’t already). Don’t just write “video,” as one hapless analyst did. Instead, pile on enough mumbo jumbo to let readers feel smart for having understood anything at all. Those who’ve tried to read 50 or 100 words will tweet about your “great” predictions.

• Aim for the horizon. Don’t let yourself be bound by others’ definition of “year.” If your vision fails to come true in 2011, you’re just that much further ahead of your time.

Above all, you must enter to win. After the first weeks of January, normal standards set in. If you feel like a fraud, remember that last week’s predictions are like last night’s eggnog. All people remember is the party, and all your readers will remember is your name.

Datadoodle.com  http://datadoodle.com/2010/12/27/impress-your-colleagues-with-year-end-predictions/?owa_source=feed&owa_sid&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed:+Datadoodle+(datadoodle)



Go to any BI or data warehousing conference and you’ll likely hear about the evils and data management disasters that come with all of the Excel-based “spreadmarts” that business users refuse to let go of. In fact, you might think that Excel is akin to the bubonic plague – and for a lot of businesses with poor spreadsheet management practices, you might be right. But according to Gartner analysts and attendees at the firm’s annual Business Intelligence Summit, it’s time for IT and BI managers to wave the white flag on using Excel for BI purposes. Their advice: Make your peace with spreadsheets and focus on developing processes for properly using Excel in BI projects. That was music to Microsoft’s ears, of course. Hoping to further capitalize on Excel’s continuing BI popularity, the software vendor released a PowerPivot for Excel add-in that lets end users integrate nearly unlimited amounts of data into their spreadsheets for analysis – although it also added a SharePoint version with management capabilities designed to help ease the collective minds of IT groups.


A relatively small number of the organizations that responded to the 2010 SearchBusinessAnalytics.com survey were using predictive analytics tools – just 16%. But 48% said that they planned to add predictive analytics software within the next 12 months, giving it the top spot on the analytics technology adoption list. Industry analysts also see predictive analytics as the next big battleground for BI vendors, which increasingly are developing or acquiring predictive analytics technology with the goal of incorporating it into their core platforms. In October, for example, IBM announced a new version of its Cognos BI software with predictive analytics capabilities built in. Thus far, many of the early adopters of predictive analytics are focusing not on wider market and economic trends but on individual customer analysis in an effort to understand what specific customers are likely to buy so that marketing campaigns and up-sell offers can be tailored to them.


The world is getting smaller. We have seen the advent of social collaboration, where younger community members now use the internet to communicate with each other far more than through the email systems of the past. Just as email displaced paper mail (snail mail), so the growing trend of collaboration and social media is replacing current methodologies. Companies are looking hard at this phenomenon with mixed feelings. The potentially “open” nature of views, comments and opinions is quite different to the one-on-one communications of email. But is this new revolution culturally acceptable to corporations?

There are many aspects to the new paradigm.

  • People must be comfortable in expressing views or comments that can be “judged” or “commented” upon in an open way.
  • People may be judged by what they say and have no control of who sees their opinions
  • Politically astute staff may choose not to comment for fear of reprisal or of being “pigeon-holed” for their opinions
  • An organization may not want certain topics commented upon
  • Could staff use the forums purely for social purposes, with no real commercial value?
  • Will all levels of the organization really contribute or will it be the few distributing comments on issues for the many?
  • How do you measure value?

The issues are not as clear as they seem, but one application that can add value whilst allowing knowledgeable workers to contribute is the application of collaborative analytics.

The concept is simple. On the open collaboration platform you post key reports for the business. Analysts, or even interested parties, can openly comment on what they see. The normal analytic process has the analyst producing a report and then commenting on what they interpret to be insight. The report and/or analysis is then provided to the manager for review. The review process becomes siloed or isolated, with interpretation based on the eyes of the reviewer: they will see only what their experience allows them to see, which may limit full understanding of the information. The analysis may not even draw on all the information available to address the issue.
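A minimal sketch of the shared-report idea, purely illustrative (the class names and sample comments are invented): a published report accumulates comments from any number of analysts, and all of those perspectives remain visible to the manager rather than a single reviewer's interpretation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Comment:
    analyst: str
    text: str

@dataclass
class SharedReport:
    """A published report that any analyst or interested party can annotate."""
    title: str
    comments: List[Comment] = field(default_factory=list)

    def add_comment(self, analyst: str, text: str) -> None:
        self.comments.append(Comment(analyst, text))

    def perspectives(self) -> List[str]:
        # Every contributor's view is visible, not just a single reviewer's.
        return [f"{c.analyst}: {c.text}" for c in self.comments]

report = SharedReport("Q3 delivery profitability")
report.add_comment("operations analyst", "Margins fall sharply on regional routes.")
report.add_comment("finance analyst", "Fuel hedging is masking part of that decline.")
print("\n".join(report.perspectives()))
```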

I have been actively promoting this concept in the organisation and have seen the benefits it can bring over the “normal” approach. It is a new era of analytics and we must embrace it. It is time for us to truly share our analytic work and to harness the power of individual thought into cross-boundary excellence and understanding. Collaborative analytics is here to stay, and it really works.