Archive for the ‘Data Warehouse’ Category


http://www.cio.com.au/article/387400/ibm_invest_100_million_big-data_analysis_research/



In-memory computing is making its way out of R&D labs and into the enterprise, enabling real-time processing and intelligence

The massive explosion in data volumes collected by many organisations has brought with it an accompanying headache: putting that data to gainful use.

Businesses increasingly need to make quick decisions, and pressure is mounting on IT departments to provide solutions that deliver quality data much faster than has been possible before. The days of trapping information in a data warehouse for retrospective analysis are fading in favour of event-driven systems that can provide data and enable decisions in real time.

Indeed, real-time computing is a new catch cry across the technology industry. A hypothetical example is a retailer that can monitor a customer’s real-time behaviour in store or on a website and draw on historical data from their loyalty system regarding spending patterns to make them offers that they might respond to in that moment.

Such a scenario has long been dreamed of, but it is being made possible today for retailers and other industries thanks in part to a technology known as in-memory computing.

In-memory computing works by bringing data physically closer to the central processing unit.

Chip manufacturers have been on this path for some time with the integration of Level 1 and Level 2 caching into microprocessors, as moving indices into Level 1 cache makes them more quickly accessible. Moving out through the caching levels usually results in a loss of speed but an increase in the size of data storage.

In-memory technology follows the same principles and moves data off disk and into main memory, eliminating the need to run a disk-seek operation each time a data look-up is performed and significantly boosting performance.
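To make the disk-versus-memory trade-off concrete, here is a minimal Python sketch (not from the article) that times the same look-up against an on-disk SQLite table and against a plain in-memory dictionary; the table and column names are invented for illustration.

```python
import os
import sqlite3
import tempfile
import time

# Build a small on-disk table to stand in for data that lives on disk.
path = os.path.join(tempfile.mkdtemp(), "orders.db")
disk = sqlite3.connect(path)
disk.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
disk.executemany("INSERT INTO orders VALUES (?, ?)",
                 ((i, i * 0.5) for i in range(100_000)))
disk.commit()

# The same data held entirely in main memory.
in_memory = {i: i * 0.5 for i in range(100_000)}

def time_lookups(fn, n=10_000):
    start = time.perf_counter()
    for i in range(n):
        fn(i)
    return time.perf_counter() - start

disk_time = time_lookups(
    lambda i: disk.execute("SELECT amount FROM orders WHERE id = ?", (i,)).fetchone())
mem_time = time_lookups(lambda i: in_memory[i])

print(f"disk-backed lookups: {disk_time:.3f}s, in-memory lookups: {mem_time:.3f}s")
```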

The idea of running databases in memory is nothing new, and was one of the foundations of the business intelligence product QlikView, released by QlikTech way back in 1997. More recently other technology companies have jumped on the bandwagon, notably SAP and TIBCO. What is making in-memory so popular now is that plunging memory prices have made it economical for a wider range of applications.

Gartner’s managing vice-president for business intelligence and data management, Ian Bertram, says the key application for in-memory technology today remains business intelligence, where it enables analysis to be conducted on the fly, with accompanying faster data refreshes.

“It’s creating a data warehouse in-memory, which is a much faster technology than doing stuff on disk,” Bertram says. “Disk clearly has a much greater capacity, but it is slower. In memory, it is instantaneous. The value comes in for people who have to make decisions really quickly, as they don’t have to wait 12 hours for an OLAP [online analytical processing] cube to be built anymore.”
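As a rough illustration of the on-the-fly analysis Bertram describes, the hypothetical Python sketch below aggregates in-memory fact rows for whatever dimensions are requested, with no pre-built OLAP cube; the field names are invented.

```python
from collections import defaultdict

# Illustrative fact rows held in main memory: (region, product, revenue).
facts = [
    ("AU", "widgets", 120.0), ("AU", "gadgets", 80.0),
    ("NZ", "widgets", 60.0),  ("AU", "widgets", 40.0),
]

def rollup(rows, dims):
    """Aggregate revenue on the fly for any requested combination of dimensions."""
    totals = defaultdict(float)
    for region, product, revenue in rows:
        key = tuple(v for v, d in ((region, "region"), (product, "product")) if d in dims)
        totals[key] += revenue
    return dict(totals)

# No pre-computed cube: each question is answered directly from the in-memory rows.
print(rollup(facts, {"region"}))             # {('AU',): 240.0, ('NZ',): 60.0}
print(rollup(facts, {"region", "product"}))  # totals per region and product
```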

As a pioneer of in-memory technology, QlikView is used by a range of Australian companies for business intelligence, including the packaging company Amcor, which is using it to make better decisions on the profitability of its delivery operations.

QlikTech’s senior vice president of products, Anthony Deighton, claims the in-memory architecture has enabled his company to build a product that focuses on ease of use.

“When users interact with our product, they can click and find their own path through the data, and do that interactively and at high speed,” he says.

The marketplace for in-memory products is rapidly becoming crowded. SAP co-founder, Hasso Plattner, has been driving the development of in-memory computing through the eponymous Institute that he founded at the University of Potsdam in 1998. Its efforts were first unveiled at SAP’s annual SAPPHIRE conference in 2009. This year, SAP announced the High-Performance Analytic Appliance (HANA) project as a roadmap for in-memory computing.

About 20 companies were invited to join the trial, but more than 40 have now done so, and SAP will be selling in-memory appliances by the end of 2010.

SAP Australia and New Zealand products and solutions group director, John Goldrick, says in-memory data look-up times are further reduced by storing the data in a column format (as is the case with SAP’s Sybase IQ database) rather than in rows, which means that for many operations only the contents of one column need to be read, not the entire table.

As each column comprises values of the same data type and size, the database can be compressed very efficiently. According to Goldrick, about 40 per cent of data storage is no longer required, along with about 80 per cent of data look-up activity. In one instance, he says, a 1.8TB database was compressed down to just 70GB.
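The following is a simplified, hypothetical Python sketch of the column-versus-row idea Goldrick describes: each column is stored separately, so a query that needs only one column never touches the others, and the uniform values within a column lend themselves to simple compression such as run-length encoding. The compression figures quoted above are SAP’s, not this sketch’s.

```python
from itertools import groupby

# Row-oriented view of some sales data.
rows = [
    ("2010-06-01", "NSW", 120.0),
    ("2010-06-01", "NSW", 80.0),
    ("2010-06-01", "VIC", 60.0),
    ("2010-06-02", "VIC", 40.0),
]

# Column-oriented view of the same data: one list per column.
columns = {
    "date":   [r[0] for r in rows],
    "state":  [r[1] for r in rows],
    "amount": [r[2] for r in rows],
}

# Summing one column reads only that column, not every full row.
total = sum(columns["amount"])

# Repetitive, uniformly typed column values compress well, e.g. run-length encoding.
def run_length_encode(values):
    return [(value, len(list(group))) for value, group in groupby(values)]

print(total)                                # 300.0
print(run_length_encode(columns["date"]))   # [('2010-06-01', 3), ('2010-06-02', 1)]
print(run_length_encode(columns["state"]))  # [('NSW', 2), ('VIC', 2)]
```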

“All of a sudden you are moving down to being able to use raw data faster than you could ever use the aggregated data, so the whole processing time becomes faster,” Goldrick says. “We did a study and worked out that we could hold the entire system of all but four of our largest customers on one blade, and get the processing speed to be 10,000 times faster than it currently is from disk.”

Most in-memory BI systems today draw data from an existing source such as a data warehouse. Reporting tools can then be pointed at the in-memory database to generate reports.
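A minimal sketch of that pattern, assuming nothing about any particular vendor’s tooling: a warehouse extract (here just a list of rows) is loaded into an in-memory SQLite database, and a “report” becomes a query against the in-memory copy. The table and column names are invented.

```python
import sqlite3

# Pretend this extract came from the existing data warehouse.
warehouse_extract = [
    ("2011-01-14", "retail", 125000.0),
    ("2011-01-14", "banking", 98000.0),
    ("2011-01-15", "retail", 131000.0),
]

# Load it into an in-memory database that reporting tools query directly.
mem_db = sqlite3.connect(":memory:")
mem_db.execute("CREATE TABLE sales (day TEXT, sector TEXT, revenue REAL)")
mem_db.executemany("INSERT INTO sales VALUES (?, ?, ?)", warehouse_extract)

# A "report" is now just a query against the in-memory copy.
for sector, revenue in mem_db.execute(
        "SELECT sector, SUM(revenue) FROM sales GROUP BY sector"):
    print(sector, revenue)
```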

 

Transactional reporting

 

The next step is to move transactional systems to in-memory, where it is possible to update and analyse transaction information in real time. In June, TIBCO released a series of new in-memory products for high-speed data management, the ActiveSpaces Suite, to provide fast shared memory for distributed applications to more quickly exchange and process real-time data.

Chairman and chief executive, Vivek Ranadivé, says it has immediate application in fields where there are high numbers of transactions, such as airlines, banking or utility smart grids.

“You can analyse things as they happen in your transaction system in real time,” Ranadivé says. “There is no extracting and translating the data into your data warehouse to run your analytics.”

His vision is to enable corporations to evolve from being transaction-driven businesses to becoming event-driven ones, where events within the organisation can trigger actions in real time based on existing data; a process he describes as the ‘two-second advantage’.

“The two-second advantage is about having a little bit of the right information in the right context just a little beforehand — whether it is two seconds, two hours or even two days,” says Ranadivé. “By sensing what is happening around them, businesses can constantly adjust and react a little ahead of the competition or in anticipation of a customer’s wants and needs.”

Underlying this is in-memory architecture, which he says offers fast, shared memory to accelerate the exchange and processing of real-time data and events. Hence, in-memory technology is generally spoken of in the same breath as the broader movement towards real-time information processing systems.
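The sketch below is not TIBCO’s ActiveSpaces API; it is only a generic, hypothetical illustration of the event-driven pattern Ranadivé describes: each transaction updates in-memory state as it arrives, and a rule fires an action the moment a condition is met, rather than after a nightly load into the warehouse.

```python
from collections import defaultdict

# In-memory running state, updated as each transaction event arrives.
spend_today = defaultdict(float)

def on_transaction(customer_id, amount, notify):
    """Handle one event: update state and react immediately if a rule matches."""
    spend_today[customer_id] += amount
    # Illustrative rule: a customer crossing a spend threshold triggers an offer now,
    # not after tonight's batch load into the data warehouse.
    if spend_today[customer_id] > 500:
        notify(customer_id)

offers_sent = []
events = [("c1", 200.0), ("c2", 50.0), ("c1", 350.0), ("c2", 30.0)]
for customer, amount in events:
    on_transaction(customer, amount, offers_sent.append)

print(offers_sent)  # ['c1'] - the offer fires in-stream, inside the event loop
```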

 



Orange, CA (PRWEB) January 19, 2011

Recent advancements in software capabilities, technologies and deployments are shifting Business Intelligence (BI) paradigms, and are changing the way companies do business.

According to a Forrester Consulting study commissioned by Geographic Business Intelligence® software developer Alteryx®, LLC, industries such as retail, real estate, banking and telecommunications are moving quickly toward single-vendor BI solutions and pre-packaged content, converging business processes toward a new generation of best-practice decision-making.

The Forrester study from October 2010, Integrated And Flexible Business Intelligence Solutions Are Vital For Better Business Decisions, identifies and outlines the state of the art in BI best practices, and how the convergence of end-to-end solutions and business processes is boosting enterprise performance and ROI. Flexible use scenarios are cited in various examples of how next-generation deployments are driving decisions in the cloud for convenience, and on-premise for security priorities. For a copy of the report, click on this link: http://alteryx.com/How-Alteryx-Performs/Pages/White-Papers/Forrester-Report-on-BI-Platform-Requirements.aspx

“Anywhere where it’s cost-effective and reveals proprietary data, there are benefits to a flexible deployment option such as on-premise, hosted or SaaS,” said an IT director at a leading apparel company who was interviewed during the study. “[The] question is, does anyone else have access to it?”

The findings support Forrester Research, Inc. analyst Boris Evelson’s December 23, 2010 report, BI Maturity In The Enterprise: 2010 Update, which identifies strong trends in agile BI with less centralization and more pervasive use across wide-ranging vertical markets and industry categories. Talent and skills are in demand, as more enterprises are combining BI and spatial analytics into their traditional business processes. Evelson will share new research and address the state of the art in BI strategies when he keynotes the Alteryx Business Leader Summit on March 9th at the Omni Interlocken Resort in Broomfield, Colo.

“Although BI is one of the original software application disciplines, spatial analytics, pre-packaged content and single-vendor solutions are helping to shift paradigms across multiple vertical markets in the BI software space,” said Alteryx CEO Dean Stoecker.

“More than a map in the cloud, spatial requires integrated, end-to-end solutions and an architectural platform that can cross the chasms between the desktop and the cloud,” said Stoecker. “The code base of the future is already transcending geospatial processing and moving toward solving multi-disciplined business problems, and changing the very nature of BI and its inter-relationship with the science of business process.”

The company’s latest offering, Alteryx 2011, is a seamlessly integrated BI platform delivering robust consumer profiling and behavior segmentation tools, advanced processing of spatial and non-spatial data, and a full range of BI functionality, including Spatial Processing; Extract, Transform and Load (ETL); Address Hygiene and Data Quality; BI Analytics; Reporting and Visualization; and Customer Data Integration (CDI).

Alteryx 2011 builds on the company’s proven technologies and now incorporates several enhanced BI features previously only available as standalone offerings:

  • Location Optimization for planning business network expansions and/or contractions
  • Behavioral Analytics for consumer profiling and segmentation
  • Automated Module Packaging for simple, enterprise-wide distribution
  • New Tool Suites with enhanced analytics and expanded charting options
  • New Macros that add iterative processing capabilities and additional optimization options
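None of this is Alteryx code, but as a loose, hypothetical sketch of how spatial processing might feed conventional BI, the example below assigns each customer to the nearest store by great-circle distance and then rolls spend up by store; all names and coordinates are invented.

```python
from collections import defaultdict
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical store and customer records: (id, latitude, longitude[, spend]).
stores = [("CBD", -33.87, 151.21), ("Parramatta", -33.82, 151.00)]
customers = [("c1", -33.88, 151.20, 420.0), ("c2", -33.80, 151.02, 310.0),
             ("c3", -33.85, 151.15, 150.0)]

spend_by_store = defaultdict(float)
for cust_id, lat, lon, spend in customers:
    nearest = min(stores, key=lambda s: haversine_km(lat, lon, s[1], s[2]))
    spend_by_store[nearest[0]] += spend

print(dict(spend_by_store))  # spend rolled up by each customer's nearest store
```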

About Alteryx

Solving business problems for nearly a quarter-million users worldwide, Alteryx is driving the global revolution in Geographic Business Intelligence®. Through smart, extensible solutions from the desktop to the web, the Alteryx software platform delivers the fastest, most comprehensive consumer, business and market insights to the Fortune 500, mid-market companies, government and academia. Alteryx hosted, on-premise and SaaS solutions integrate spatial intelligence into enterprise workflows, seamlessly scaled across local, regional and global markets. Inspiring ingenuity since 1997, Alteryx, LLC is headquartered in Orange, Calif., with its Technology Center in Boulder, Colo. For more information please visit http://www.alteryx.com.

Alteryx and Geographic Business Intelligence are registered trademarks of Alteryx, LLC

Read more: http://www.benzinga.com/press-releases/11/01/p790898/business-intelligence-processes-converge-to-drive-next-generation-busin#ixzz1BZAMGw7x


Good article on the changing appliance market.

 

……………………………………………………………………………………………………….

One year after announcing a $250 million, three-year pact to deliver next-generation data center technology, Hewlett-Packard Co. and Microsoft today unveiled five appliances that offer Exchange and SQL Server in turnkey configurations.

The two companies’ offerings range from a so-called private cloud in a box designed to virtualize databases to a $2 million data warehouse appliance. The latter marks the official release of Microsoft’s largest and most scalable version of its database to date, SQL Server 2008 R2 Parallel Data Warehouse edition, code-named Project Madison.

 

“We’re paying off on the announcement we made a year ago,” said Doug Small, HP’s director of infrastructure to applications business.

The new HP E5000 Messaging System for Microsoft Exchange Server 2010 marks the first time Redmond’s core e-mail platform has been available in a turnkey configuration. The companies say the system can be deployed in a matter of hours. Small emphasized the system’s design around high availability. “The solutions have high availability directly built into it to ensure secure communication across the whole messaging system,” he said.

A system designed for mid-sized businesses will support 500 mailboxes, each at 1 GB, with an enterprise version designed to accommodate 3,000 mailboxes at 2 GB apiece. Organizations can cascade multiple appliances to scale them out, Small said. The Exchange appliances, due to ship in March, will start at $35,000 each for the hardware alone; software licensing is separate.

The other appliances are centered around SQL Server: they include the HP Business Decision Appliance, the Business Data Warehouse Appliance and the HP Database Consolidation Appliance.

The Business Decision Appliance is aimed at the rapid provisioning of BI implementations, said Fausto Ibarra, Microsoft’s senior director of BI. “It’s designed as a self-service BI solution, with everything you need, software and hardware, to empower end users to analyze large amounts of data from any number of data sources,” he said.

Microsoft said the appliance is optimized for running both SQL Server and SharePoint and can be installed and configured in less than one hour. It also runs the PowerPivot data analysis tool for Excel and provides a single dashboard for auditing, monitoring and managing BI solutions running on the appliance. Pricing starts at $28,000 for the hardware only. It is available now.

At the high end of the spectrum, HP today began shipping the Enterprise Data Warehouse Appliance, announced back in November, which the companies say is 10 times more scalable than previous SQL Server deployments and capable of delivering queries 200 times faster.

For those looking to build smaller data warehouses based on SQL Server,  HP announced the Business Data Warehouse Appliance, which is a scaled down version aimed at small and medium businesses, also built on SQL Server 2008 R2 Parallel Data Warehouse edition.

“The Enterprise Data Warehouse Appliance is targeted at the high-end data warehousing scenarios, but there’s also many cases where companies are looking for an entry level data warehouse or a data mart that complements their enterprise data warehouse,” Ibarra said. “The HP Business Data Warehouse Appliance is aimed at smaller data warehousing deployments. Also it’s very complementary to the Enterprise Data Warehouse. It works together but it’s independent.” The smaller Business Data Warehouse Appliance will be available in June, Microsoft said. Pricing was not disclosed.

The last of the five is the HP Database Consolidation Appliance, aimed at integrating hundreds or even thousands of databases into a private cloud environment. The system is optimized for SQL Server 2008 R2 and Hyper-V Cloud, which is designed for rapid deployment, Microsoft said.

“Nowadays most companies are looking to build private clouds to get the benefits of public clouds, get them into their own data center,” Ibarra said. “Essentially [that means] having a private cloud solution which can enable them to consolidate transaction applications, sharing pools of resources and making it very easy to allocate capacity.” The HP Database Consolidation Appliance will be available in the second half of this year. The company did not disclose pricing.

While the new appliances will be available direct and through any HP or Microsoft channel partner, both companies emphasize that customers would be best served by acquiring them from partners engaged in the companies’ joint Frontline Channel Partner program.

“We’re not constraining the product availability, we have a specific program that we’ve developed and funded as part of this initiative that has the targeted training and other resources available,” Small said. “So it will be in a partner’s best interest to sign up for either the SQL or Exchange subset of the Frontline partner program to get access to the program.”

 


Data warehouse technology from innovative leader pays big, tangible dividends for customers

DAYTON, Ohio, Jan. 14, 2011 /PRNewswire/ — Teradata Corporation (NYSE: TDC) today announced that it has been recognized for its technology dominance of the data warehouse landscape by a new study from the Information Difference research firm. The research report further validates Teradata’s leadership as the world’s largest company solely focused on data warehousing and enterprise analytics.


“As a pioneer in the data warehousing market, Teradata has proven its ability to deliver a foundation for a wide range of business analytics, which supports both departmental and enterprise intelligence for companies of all sizes.  Teradata’s reference customers were some of the most satisfied in our survey, and we ranked Teradata highest for technology of all vendors,” said Andy Hayler, chief executive officer and co-founder of The Information Difference. “This performance and capability pays big, tangible dividends for Teradata customers by helping them to out-maneuver their competitors.”

“It is gratifying that Teradata was recognized, and this validates our strategy of helping customers achieve success through the use of business analytics. To support our customers, Teradata offers a complete platform family designed to perform in today’s toughest environments and address all manner of analytic requirements,” said Scott Gnau, head of development, Teradata Corporation.  “Leveraging our unique and unmatched architecture, the Teradata platform family not only offers market-leading capabilities, but delivers the earliest adoption of breakthrough technologies like flash storage, multi-core and in-memory processing, all in a seamless environment.”

The technology dimension position is derived from a weighted set of scores based on four factors: customer satisfaction as measured by a survey of reference customers, analyst impressions of the technology, maturity of the technology, and breadth of technology in terms of coverage against the Information Difference functionality model.
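The study does not publish its weights, but the kind of weighted composite it describes might look like the hypothetical sketch below, with made-up weights and scores.

```python
# Hypothetical weights and 0-10 scores for the four factors named above.
weights = {"customer_satisfaction": 0.4, "analyst_impression": 0.2,
           "maturity": 0.2, "breadth": 0.2}
vendor_scores = {"customer_satisfaction": 9.1, "analyst_impression": 8.5,
                 "maturity": 9.0, "breadth": 8.0}

# The technology dimension position is then a weighted sum of the factor scores.
technology_score = sum(weights[f] * vendor_scores[f] for f in weights)
print(round(technology_score, 2))  # 8.74 with these made-up inputs
```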


Good article by Denise Rogers on Successful DW/BI solutions.

 

The ingredients to a successful Data Warehouse / Business Intelligence deployment include good project management, effective communication and using DW/BI tools to their full potential. Denise Rogers presents her top 10 techniques to guarantee a successful DW/BI deployment.

The ingredients of a successful Data Warehouse / Business Intelligence deployment include good project management, effective communication and using DW/BI tools to their full potential. A Data Warehouse / Business Intelligence application is being built, NOT an OLTP application. So use the ETL solution and its features in support of the solution, use the Business Intelligence software to its full potential, and use the DBMS and its Data Warehouse functionality. Other essential ingredients include trained resources, experienced hires, or both: people who know how to exploit the features and functionality that each component of the solution has to offer.

The mantra should be to design a solution based on the business requirements and the strategic direction, not to force a DW/BI solution into an OLTP deployment strategy that has been around for years.

The functionality of the tools should be completely understood; this is the responsibility of the team, so that every available component of the tools can be explored as a design alternative. This ensures a comprehensive understanding of the features and functions available to design a robust solution. So, without further delay, here are the top 10 ways to guarantee a successful DW/BI deployment.

Number 10 – Manage the DW/BI project…there are too many risks not to

A DW/BI project is a huge undertaking for any organization. On this, we must agree. So why would any organization not want to have a formal project management approach to ensure success? The answer should be obvious. A formal process for administering the project, one that ensures communication, risks, scope, cost, time and quality are managed successfully and effectively, is both required and essential.

Number 9 – Sponsorship…don’t leave home without it

There’s quite a bit of momentum behind the DW/BI initiative. That’s great! However, without the approval and sponsorship of senior management, the project goes nowhere really fast! Executive sponsorship keeps the project moving forward: it enables the alignment of resources, secures approved budgets and opens the lines of communication. This is gold for the DW/BI project.

Number 8 – Documentation is the gift that keeps on giving

Documentation IS the gift that keeps on giving. Why? There are so many moving parts and so many components to the DW/BI solution. Having living, breathing documented artifacts, whether through standardized documentation practices or through standards created for the project, enables the DW/BI solution to become a scalable, supportable and maintainable set of applications and minimizes the effort needed to resolve issues.

Number 7 – Deploying a DW/BI solution is hard enough, don’t make it harder

Building a DW/BI solution is one of the most ambitious undertakings for any organization. There is no real need to take on unnecessary risks. One such risk is using buggy, uncertified software code to build parts of the solution. The beta version may come at a reduced cost at the outset, but how much will it cost the project when it breaks during the production deployment? And it will break! Staying a release or two back is not a bad idea: these versions have most of the bugs worked out, so there is minimal risk. A DW/BI solution should be on the leading edge of software, not the bleeding edge!

Number 6 – Metadata is more important than you think

Metadata management and data governance are an integral part of any DW/BI solution; they are more important than you think! Metadata not only defines and describes the data elements in the data mart, it provides a lineage, or audit trail, back to their source of origin. Metadata defines the purpose of the data, why it was originally created and who its business owner is. This is valuable information when the data in a report is being questioned during an audit and requires validation. It also reduces the time required to enhance a BI application, particularly when creating new reports.
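As a loose illustration (not from the article), a metadata record for a single data-mart element might carry something like the following; the field names and values are invented, and real metadata repositories are far richer.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ElementMetadata:
    """Descriptive metadata for one data-mart element, including its lineage."""
    name: str
    definition: str
    business_owner: str
    source_system: str
    lineage: List[str] = field(default_factory=list)  # transformations back to origin

net_revenue = ElementMetadata(
    name="net_revenue",
    definition="Invoice amount net of returns and discounts",
    business_owner="Finance",
    source_system="billing_oltp",
    lineage=["billing_oltp.invoice.amount",
             "ETL: subtract returns and discounts",
             "mart.fact_sales.net_revenue"],
)

# During an audit, the lineage answers "where did this number come from?"
print(" -> ".join(net_revenue.lineage))
```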

Number 5 – A spreadsheet does not a UI make

When there has been a lot of effort invested in cleansing, transforming and loading data into a DW/BI environment and there is high demand to report on the data, give the business a reporting solution that really speaks to its requirements. A spreadsheet is fine for reporting in certain situations, but if there is an enterprise BI solution with the features and functionality that the business community is asking for, it is the job of the IT team to create an application that actually uses all that the BI solution is capable of! Remember, the BI reporting solution is not just a spreadsheet; the evaluation was exhaustive and all features and functionality were tested, so why use only 10% of that functionality to deploy the DW/BI solution?

Number 4 – Upgrade the infrastructure…it makes the DW/BI solution work…really

A DW/BI application needs an infrastructure that supports massive amounts of data being moved, transformed, cleansed, loaded, stored and reported on. To ensure the success of a DW/BI initiative, it is essential that the project plan includes tasks to assess and upgrade the infrastructure. A DW/BI solution requires lots of storage and lots of memory; make sure the platform is ready to host the solution.

Number 3 – ETL can do a lot of things…and it should

An ETL solution does so much more than extract and load data. A complete ETL solution includes data profiling, standardization, cleansing and transformation. So design an ETL solution that uses these features; make the ETL software work to 100% of its design, not 10%. ETL software is NOT just a data mover!
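To make that concrete, here is a toy Python sketch (not tied to any particular ETL product) in which a couple of illustrative records are profiled, standardised, cleansed and transformed before being loaded.

```python
raw_records = [
    {"customer": " ACME Pty Ltd ", "state": "nsw", "amount": "120.50"},
    {"customer": "Acme Pty Ltd",   "state": "NSW", "amount": "bad-value"},
]

def profile(records):
    """Data profiling: how many rows fail basic expectations?"""
    bad = sum(1 for r in records if not r["amount"].replace(".", "", 1).isdigit())
    print(f"{bad} of {len(records)} rows have a non-numeric amount")

def standardise(record):
    """Standardisation: consistent casing and trimmed whitespace."""
    record["customer"] = record["customer"].strip().title()
    record["state"] = record["state"].upper()
    return record

def cleanse(record):
    """Cleansing: drop rows that cannot be repaired."""
    try:
        record["amount"] = float(record["amount"])
        return record
    except ValueError:
        return None

def transform(record):
    """Transformation: derive a value the data mart needs."""
    record["amount_incl_gst"] = round(record["amount"] * 1.10, 2)
    return record

profile(raw_records)
loaded = [transform(r)
          for r in (cleanse(standardise(dict(r))) for r in raw_records) if r]
print(loaded)  # rows that survived the pipeline, ready to load into the target
```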

Number 2 – Put the Database to work

Oracle, UDB and SQL Server (to name the big three) have come a long way in the DW/BI space. Get all of the components working to support the DW/BI solution, not just part of it; all of it! The DBMS offers quite a few DW/BI features, such as star schemas with aggregate functions and data cubes that support drill-through. The data mart should not be in third normal form with views that support complex joins.
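As a minimal illustration of the star-schema point, the sketch below uses Python’s built-in SQLite in place of Oracle, UDB or SQL Server: a fact table joined to small dimension tables and aggregated directly in the database, rather than through layers of third-normal-form views. The schema and data are invented.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, revenue REAL);

INSERT INTO dim_product VALUES (1, 'Widgets'), (2, 'Gadgets');
INSERT INTO dim_date    VALUES (20110114, 2011), (20101231, 2010);
INSERT INTO fact_sales  VALUES (1, 20110114, 120.0), (2, 20110114, 80.0),
                               (1, 20101231, 60.0);
""")

# One star join plus an aggregate answers the business question directly.
query = """
SELECT d.year, p.category, SUM(f.revenue) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_key = f.product_key
JOIN dim_date d    ON d.date_key    = f.date_key
GROUP BY d.year, p.category
"""
for row in db.execute(query):
    print(row)  # e.g. (2011, 'Widgets', 120.0)
```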

Number 1 – Get the right resources in place to do the job

Some of the biggest DW/BI successes and failures are linked to how the project has been resourced. For organizations embarking on their first DW/BI deployment, it is extremely important that teams are led by seasoned DW/BI veterans. These senior architects and developers will influence the organization to align its thinking around the DW/BI solution. This includes training the team to use the DW/BI software and using experienced hires to set the strategic direction and implement policies and procedures that ensure the success of the DW/BI project.

http://www.ecrmguide.com/daily_news/article.php/419280/10-techniques-for-successful-business-intelligence-deployment.htm

 


The world of analytics has changed: top organisations can now harness the power of multiple views and opinions from within. Analytics has traditionally been a one-to-one relationship: someone requests a piece of analytics, an analyst creates and prepares the work, and it is then presented back for consideration and action.

Now, with the deployment of social media, analytics can be created and then commented on by multiple analysts. The multiple “eyes” or “views” give greater insight than one individual with their own skills and biases. Managers can see various aspects of the analytics from varying points of view. It may be that everyone sees the same thing and the recommendations are the same, but it may be that aspects are revealed that were not previously considered. This was the approach I took at the World Bank to ensure we harnessed views from many sources. It was amazing how many differing views came forward from the same data when it was applied across many geographies and departments. This is the new world of analytics, and it can be harnessed if done the right way.

I saw the value in this approach and am now showing other organisations how to harness this power. It allows “cognitive variance” to be applied, giving a richness of understanding and harnessing corporate knowledge. It can also be used between organisations in an industry. It is a little like spreads theory, where you harness the thinking of many individuals to reveal a bias or trend.
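As a loose illustration of the idea (not any particular product), the sketch below collects hypothetical scores from several analysts reviewing the same piece of analytics and surfaces both the consensus and the spread of opinion, so divergent views are visible rather than lost.

```python
from statistics import mean, pstdev

# Hypothetical scores (0-10) from analysts in different departments and regions
# reviewing the same piece of analytics.
assessments = {
    "analyst_finance_au": 8.0,
    "analyst_risk_eu": 4.5,
    "analyst_ops_us": 7.5,
    "analyst_marketing_asia": 5.0,
}

consensus = mean(assessments.values())
spread = pstdev(assessments.values())

print(f"consensus score: {consensus:.1f}, spread: {spread:.1f}")
if spread > 1.5:
    # A wide spread is itself a signal: the reviewers disagree and that is worth discussing.
    print("High variance across reviewers: investigate the differing views")
```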

Analytics is evolving, and the next step is collaborative analytics.

Get in early, and gain the first to market advantage.

Paul Ormonde-James