Posts Tagged ‘Paul Ormonde-James’


In-memory computing is making its way out of R&D labs and into the enterprise, enabling real-time processing and intelligence

The massive explosion in data volumes collected by many organisations has brought with it an accompanying headache in terms of putting it to gainful use.

Businesses increasingly need to make quick decisions, and pressure is mounting on IT departments to provide solutions that deliver quality data much faster than has been possible before. The days of trapping information in a data warehouse for retrospective analysis are fading in favour of event-driven systems that can provide data and enable decisions in real time.

Indeed, real-time computing is a new catch cry across the technology industry. A hypothetical example is a retailer that monitors a customer’s behaviour in store or on a website in real time and draws on historical spending patterns from its loyalty system to make offers the customer might respond to in that moment.

Such a scenario has long been dreamed of, but it is being made possible today for retailers and other industries thanks in part to a technology known as in-memory computing.

In-memory computing works by bringing data physically closer to the central processing unit.

Chip manufacturers have been on this path for some time with the integration of Level 1 and Level 2 caching into microprocessors, as moving indices into Level 1 cache makes them more quickly accessible. Moving out through the caching levels usually results in a loss of speed but an increase in the size of data storage.

In-memory technology follows the same principles and moves data off disks and into main memory, eliminating the need to run a disk-seek operation each time a data look-up is performed and significantly boosting performance.
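To make the disk-versus-memory trade-off concrete, here is a minimal Python sketch. It is purely illustrative; the table name, data volumes and any resulting timings are my own assumptions, not figures from the article. It builds the same small table on disk and in RAM using SQLite and times repeated look-ups against each.

```python
import sqlite3
import time

# Build a small sample table on disk (illustrative data only).
disk_db = sqlite3.connect("sales.db")
disk_db.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER PRIMARY KEY, amount REAL)")
disk_db.executemany("INSERT INTO sales (amount) VALUES (?)",
                    [(i * 1.5,) for i in range(100_000)])
disk_db.commit()

# Copy the same table into an in-memory database.
mem_db = sqlite3.connect(":memory:")
disk_db.backup(mem_db)  # Connection.backup is available in Python 3.7+

def timed_lookup(conn, label):
    """Run 10,000 point look-ups and report how long they took."""
    start = time.perf_counter()
    for key in range(1, 10_001):
        conn.execute("SELECT amount FROM sales WHERE id = ?", (key,)).fetchone()
    print(f"{label}: {time.perf_counter() - start:.3f}s for 10,000 look-ups")

timed_lookup(disk_db, "disk-backed")  # each miss may involve a disk seek
timed_lookup(mem_db, "in-memory")     # served entirely from RAM
```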

The idea of running databases in memory is nothing new, and was one of the foundations of the business intelligence product QlikView, released by QlikTech way back in 1997. More recently other technology companies have jumped on the bandwagon, notably SAP and TIBCO. What is making in-memory so popular now is that plunging memory prices have made it economical for a wider range of applications.

Gartner’s managing vice-president for business intelligence and data management, Ian Bertram, says the key application for in-memory technology today remains business intelligence, where it enables analysis to be conducted on the fly, with faster data refreshes.

“It’s creating a data warehouse in-memory, which is a much faster technology than doing stuff on disk,” Bertram says. “Disk clearly has a much greater capacity, but it is slower. In memory, it is instantaneous. The value comes in for people who have to make decisions really quickly, as they don’t have to wait 12 hours for an OLAP [online analytical processing] cube to be built anymore.”

As a pioneer of in-memory technology, QlikView is used by a range of Australian companies for business intelligence, including the packaging company Amcor, which is using it to make better decisions on the profitability of its delivery operations.

QlikTech’s senior vice president of products, Anthony Deighton, claims the in-memory architecture has enabled his company to build a product that focuses on ease of use.

“When users interact with our product, they can click and find their own path through the data, and do that interactively and at high speed,” he says.

The marketplace for in-memory products is rapidly becoming crowded. SAP co-founder Hasso Plattner has been driving the development of in-memory computing through the eponymous institute he founded at the University of Potsdam in 1998. Its efforts were first unveiled at SAP’s annual SAPPHIRE conference in 2009. This year, SAP announced the High-Performance Analytic Appliance (HANA) project as a roadmap for in-memory computing.

About 20 companies were invited to join the trial, but more than 40 have now done so, and SAP will be selling in-memory appliances by the end of 2010.

SAP Australia and New Zealand products and solutions group director, John Goldrick, says that in-memory data look-up times are further reduced through the data being stored in a column format (as is the case with SAP’s Sybase IQ database) rather than rows, which means that for many operations only the contents of one column need to be read, not the entire table.

As each column consists of values of the same data type and size, the databases can be efficiently compressed. According to Goldrick, about 40 per cent of data storage is no longer required, along with about 80 per cent of data look-up activity. In one instance, he says, a 1.8TB database was compressed down to just 70GB.

“All of a sudden you are moving down to being able to use raw data faster than you could ever use the aggregated data, so the whole processing time becomes faster,” Goldrick says. “We did a study and worked out that we could hold the entire system of all but four of our largest customers on one blade, and get the processing speed to be 10,000 times faster than it currently is from disk.”
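As a rough illustration of why column storage reads less data and compresses so well, the sketch below is a toy Python model built on my own assumptions, not SAP’s or Sybase IQ’s implementation. It stores the same records row-wise and column-wise, answers a query by touching only one column, and run-length encodes a repetitive column.

```python
from itertools import groupby

# The same three-field records, stored two ways.
rows = [("AU", "retail", 120.0), ("AU", "retail", 95.5), ("AU", "wholesale", 300.0),
        ("NZ", "retail", 80.0), ("NZ", "retail", 110.0)]

# Row store: a query on 'country' must touch every full record.
# Column store: each field lives in its own contiguous array.
columns = {
    "country": [r[0] for r in rows],
    "channel": [r[1] for r in rows],
    "amount":  [r[2] for r in rows],
}

def run_length_encode(values):
    """Compress a column of same-typed values into (value, count) pairs."""
    return [(value, sum(1 for _ in group)) for value, group in groupby(values)]

# Only the 'country' column is read to answer "how many AU sales?"...
au_sales = sum(1 for c in columns["country"] if c == "AU")
print(au_sales)                               # 3

# ...and repetitive columns collapse dramatically when encoded.
print(run_length_encode(columns["country"]))  # [('AU', 3), ('NZ', 2)]
```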

Most in-memory BI systems today draw data from an existing source such as a data warehouse. Reporting tools can then be pointed at the in-memory database to generate reports.
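A minimal sketch of that pattern, assuming a hypothetical SQLite warehouse file and a hypothetical sales table, would pull the needed data out of the existing source once and point all subsequent “reporting” at the in-memory copy:

```python
import sqlite3
import pandas as pd

# Existing source: a disk-based warehouse (hypothetical file name and schema).
source = sqlite3.connect("warehouse.db")

# Pull the relevant table into memory once...
sales = pd.read_sql("SELECT region, product, amount FROM sales", source)

# ...then run the reporting queries against the in-memory frame, not the warehouse.
report = (sales.groupby(["region", "product"], as_index=False)["amount"]
               .sum()
               .sort_values("amount", ascending=False))
print(report.head(10))
```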

 

Transactional reporting

 

The next step is to move transactional systems to in-memory, where it is possible to update and analyse transaction information in real time. In June, TIBCO released a series of new in-memory products for high-speed data management, the ActiveSpaces Suite, to provide fast shared memory for distributed applications to more quickly exchange and process real-time data.

Chairman and chief executive, Vivek Ranadivé, says it has immediate application in fields where there are high numbers of transactions, such as airlines, banking or utility smart grids.

“You can analyse things as they happen in your transaction system in real time,” Ranadivé says. “There is no extracting and translating the data into your data warehouse to run your analytics.”
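The following Python sketch is a generic illustration of that idea rather than TIBCO’s ActiveSpaces API; the event fields, threshold and rule are all invented. Each transaction is analysed against an in-memory running aggregate as it arrives, with no extract-and-load step into a warehouse.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Transaction:
    account: str
    amount: float

# In-memory state shared by the analytics: running totals per account.
running_totals = defaultdict(float)

def on_transaction(txn: Transaction) -> None:
    """Analyse each event as it happens, with no extract-and-load step."""
    running_totals[txn.account] += txn.amount
    # A trivial real-time rule: flag unusually large cumulative spend.
    if running_totals[txn.account] > 10_000:
        print(f"ALERT: {txn.account} has exceeded 10,000 in cumulative spend")

# Simulated event stream (in practice this would come from the transaction system).
for event in [Transaction("ACME", 4_000.0),
              Transaction("ACME", 7_500.0),
              Transaction("GLOBEX", 250.0)]:
    on_transaction(event)
```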

His vision is to enable corporations to evolve from being transaction-driven businesses to becoming event-driven ones, where events within the organisation can trigger actions in real time based on existing data; a process he describes as the ‘two-second advantage’.

“The two-second advantage is about having a little bit of the right information in the right context just a little beforehand — whether it is two seconds, two hours or even two days,” says Ranadivé. “By sensing what is happening around them, businesses can constantly adjust and react a little ahead of the competition or in anticipation of a customer’s wants and needs.”

Underlying this is in-memory architecture, which he says offers fast, shared memory to accelerate the exchange and processing of real-time data and events. Hence, in-memory technology is generally spoken of in the same breath as the broader movement towards real-time information processing systems.

 


Good article on the changing appliance market

 

……………………………………………………………………………………………………….

One year after announcing a $250 million, three-year pact to deliver next-generation data center technology, Hewlett-Packard Co. and Microsoft today unveiled five appliances that offer Exchange and SQL Server in turnkey configurations.

The two companies’ offerings range from a so-called private cloud in a box designed to virtualize databases to a $2 million data warehouse appliance. The latter marks the official release of Microsoft’s largest and most scalable version of its database to date, SQL Server 2008 R2 Parallel Data Warehouse edition, code-named Project Madison.

 

“We’re paying off on the announcement we made a year ago,” said Doug Small, HP’s director of infrastructure to applications business.

The new HP E5000 Messaging System for Microsoft Exchange Server 2010 marks the first time Redmond’s core e-mail platform has been available in a turnkey configuration. The companies say the system can be deployed in a matter of hours. Small emphasized the system’s design around high availability. “The solutions have high availability directly built into it to ensure secure communication across the whole messaging system,” he said.

A system designed for mid-sized businesses will support 500 mailboxes, each at 1 GB, with an enterprise version designed to accommodate 3,000 mailboxes at 2 GB apiece. Organizations can cascade multiple appliances to scale out, Small said. The Exchange appliances, due to ship in March, will start at $35,000 each for the hardware alone; software licensing is separate.

The other appliances are centered around SQL Server: they include the HP Business Decision Appliance, the Business Data Warehouse Appliance and the HP Database Consolidation Appliance.

The Business Decision Appliance is aimed at rapid provisioning of BI implementations, said Fausto Ibarra, Microsoft’s senior director of BI. “It’s designed as a self-service BI solution, with everything you need, software and hardware, to empower end users to analyze large amounts of data from any number of data sources,” he said.

Microsoft said the appliance is optimized for running both SQL Server and SharePoint and can be installed and configured in less than one hour. It also runs the PowerPivot data analysis tool for Excel and provides a single dashboard for auditing, monitoring and managing BI solutions running on the appliance. Pricing starts at $28,000 for the hardware only. It is available now.

At the high end of the spectrum, HP today began shipping the Enterprise Data Warehouse Appliance, announced back in November, which the companies say is 10 times more scalable than previous SQL Server deployments and capable of delivering queries 200 times faster.

For those looking to build smaller data warehouses based on SQL Server,  HP announced the Business Data Warehouse Appliance, which is a scaled down version aimed at small and medium businesses, also built on SQL Server 2008 R2 Parallel Data Warehouse edition.

“The Enterprise Data Warehouse Appliance is targeted at the high-end data warehousing scenarios, but there’s also many cases where companies are looking for an entry level data warehouse or a data mart that complements their enterprise data warehouse,” Ibarra said. “The HP Business Data Warehouse Appliance is aimed at smaller data warehousing deployments. Also it’s very complementary to the Enterprise Data Warehouse. It works together but it’s independent.” The smaller Business Data Warehouse Appliance will be available in June, Microsoft said. Pricing was not disclosed.

The last of the five is the HP Database Consolidation Appliance, aimed at integrating hundreds or even thousands of databases into a private cloud environment. The system is optimized for SQL Server 2008 R2 and Hyper-V Cloud, and is designed for rapid deployment, Microsoft said.

“Nowadays most companies are looking to build private clouds to get the benefits of public clouds, get them into their own data center,” Ibarra said. “Essentially [that means] having a private cloud solution which can enable them to consolidate transaction applications, sharing pools of resources and making it very easy to allocate capacity.” The HP Database Consolidation Appliance will be available in the second half of this year. The company did not disclose pricing.

While the new appliances will be available directly and through any HP or Microsoft channel partners, both companies are emphasizing that customers would be best served acquiring them from partners engaged in the companies’ joint Frontline Channel Partner program.

“We’re not constraining the product availability, we have a specific program that we’ve developed and funded as part of this initiative that has the targeted training and other resources available,” Small said. “So it will be in a partner’s best interest to sign up for either the SQL or Exchange subset of the Frontline partner program to get access to the program.”

 


Data warehouse technology from innovative leader pays big, tangible dividends for customers

DAYTON, Ohio, Jan. 14, 2011 /PRNewswire/ — Teradata Corporation (NYSE: TDC) today announced that it has been recognized for its technology dominance in the data warehouse landscape by a new study from the research firm The Information Difference. The research report further validates Teradata’s position as the world’s largest company solely focused on data warehousing and enterprise analytics.


“As a pioneer in the data warehousing market, Teradata has proven its ability to deliver a foundation for a wide range of business analytics, which supports both departmental and enterprise intelligence for companies of all sizes.  Teradata’s reference customers were some of the most satisfied in our survey, and we ranked Teradata highest for technology of all vendors,” said Andy Hayler, chief executive officer and co-founder of The Information Difference. “This performance and capability pays big, tangible dividends for Teradata customers by helping them to out-maneuver their competitors.”

“It is gratifying that Teradata was recognized, and this validates our strategy of helping customers achieve success through the use of business analytics. To support our customers, Teradata offers a complete platform family designed to perform in today’s toughest environments and address all manner of analytic requirements,” said Scott Gnau, head of development, Teradata Corporation.  “Leveraging our unique and unmatched architecture, the Teradata platform family not only offers market-leading capabilities, but delivers the earliest adoption of breakthrough technologies like flash storage, multi-core and in-memory processing, all in a seamless environment.”

The technology dimension position is derived from a weighted set of scores based on four factors: customer satisfaction as measured by a survey of reference customers, analyst impressions of the technology, maturity of the technology, and breadth of technology in terms of coverage against the Information Difference functionality model.


A relatively small number of the organizations that responded to the 2010 SearchBusinessAnalytics.com survey were using predictive analytics tools – just 16%. But 48% said that they planned to add predictive analytics software within the next 12 months, giving it the top spot on the analytics technology adoption list. Industry analysts also see predictive analytics as the next big battleground for BI vendors, which increasingly are developing or acquiring predictive analytics technology with the goal of incorporating it into their core platforms. In October, for example, IBM announced a new version of its Cognos BI software with predictive analytics capabilities built in. Thus far, many of the early adopters of predictive analytics are focusing not on wider market and economic trends but on individual customer analysis in an effort to understand what specific customers are likely to buy so that marketing campaigns and up-sell offers can be tailored to them.
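As a hedged illustration of that customer-level focus, and not any vendor’s product, the sketch below fits a propensity-to-respond score from a few invented behavioural features using scikit-learn; the feature names and data are assumptions made purely for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic customer features: [visits last month, days since last purchase, past purchases].
X = np.array([[12, 3, 9], [1, 40, 0], [7, 10, 4], [2, 60, 1],
              [15, 2, 12], [3, 25, 2], [9, 5, 6], [0, 90, 0]])
# 1 = responded to a past up-sell offer, 0 = did not.
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])

# Fit a simple propensity model on the historical responses.
model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new customer: probability they will respond to a tailored offer.
new_customer = np.array([[8, 7, 5]])
print(f"Propensity to respond: {model.predict_proba(new_customer)[0, 1]:.2f}")
```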


The world is getting smaller. We have seen the advent of social collaboration, where younger community members now use the internet to communicate with each other far more than through the email systems of the past. Just as email displaced paper mail (snail mail), so the growing trend of collaboration and social media is replacing current methodologies. Companies are looking hard at this phenomenon with mixed feelings. The potentially “open” nature of views, comments and opinions is quite different to the one-on-one communications of email. But is this new revolution culturally acceptable to corporations?

There are many aspects to the new paradigm.

  • People must be comfortable expressing views or comments that can be “judged” or “commented” upon in an open way.
  • People may be judged by what they say and have no control over who sees their opinions.
  • Politically astute staff may choose not to comment for fear of reprisal or of being “pigeonholed” for their opinions.
  • An organization may not want certain topics commented upon.
  • Could staff use the forums purely socially, creating no real commercial value?
  • Will all levels of the organization really contribute, or will a few distribute comments on issues for the many?
  • How do you measure value?

The issues are not as clear as they seem, but one application that can add value whilst allowing knowledge workers to contribute is collaborative analytics.

The concept is simple. On an open collaboration platform you post key reports for the business, and analysts or any interested parties can openly comment on what they see. In the normal analytic process, the analyst produces a report, comments on what they interpret to be the insight, and then provides the report and/or analysis to a manager for review. That review process is siloed: interpretation rests on the eyes of the reviewer, who sees only what their experience allows, which may limit full understanding of the information. The analysis may not even draw on all the information available to address the issue.
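A minimal sketch of the underlying idea, with invented class and field names, is simply a shared report object that carries its open commentary with it, so every reviewer sees every other reviewer’s interpretation:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Comment:
    author: str
    text: str
    posted_at: datetime = field(default_factory=datetime.now)

@dataclass
class SharedReport:
    title: str
    findings: str
    comments: List[Comment] = field(default_factory=list)

    def discuss(self, author: str, text: str) -> None:
        """Any interested party can add an open, visible comment."""
        self.comments.append(Comment(author, text))

# One report, many perspectives, all visible to every reviewer.
report = SharedReport("Q3 delivery profitability", "Margins fall sharply on rural routes.")
report.discuss("analyst", "Fuel surcharges were missing on a third of rural invoices.")
report.discuss("ops manager", "Route consolidation trial starts next month; worth tracking.")

for c in report.comments:
    print(f"[{c.author}] {c.text}")
```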

I have been actively promoting this concept in the organization and have seen the benefits it can bring over the “normal” approach. It is a new era of analytics and we must embrace it. It is time for us to truly share our analytic work and to harness the power of individual thought into cross-boundary excellence and understanding. Collaborative analytics is here to stay, and it really works.


Today the term business intelligence is tainted. Many believe it means anything from canned reports to the whole universe of intelligence. What is clear is that, no matter what you want to call it, organisations are getting smarter.

Some choose to use the term “Decision Intelligence”, which reflects the need for information to translate into decisions or actions. A great deal of “business intelligence” is neither helpful to the business nor driven by the business; maybe it should be called IT Intelligence.

I suggest that the value is delivered from understanding what the business would like to decide, when, and with what degree of understanding. Knowing this, we can then deliver “intelligence”, or information with context, to the user and decision maker. Quite often the organisation misses this fundamental point. Canned reports may tell you “what”, but you need some granularity to tell you “why” (insight) and more trusted information to tell you what could happen (foresight).

The bottom line is that no matter what you call it, if people are not using it to make decisions, what good is it?

Paul OJ

