Archive for the ‘Business Intelligence’ Category


http://www.toptechnews.com/story.xhtml?story_id=131008O5BA6H&page=2


http://www.cio.com.au/article/387400/ibm_invest_100_million_big-data_analysis_research/



The world of Business Intelligence is changing. Access to information in the corporate world is now a must, not just a nice-to-have. With the advent of social media, information can be easily accessed by everyone on both desktop and mobile platforms. I recently deployed a substantial social media platform across the World Bank, and it was seen as a major step forward in communications. But what generated even more interest was collaborative analytics, where analysis can be performed by many analysts anywhere in the world on the same data set. This brings great depth and insight from many perspectives, and guards against the singular errors that can emanate from traditional approaches.


It is no accident that social media is an integral part of future BI initiatives and is changing the way we will deliver value to the business. From collaborative analytics comes collaborative decision making. To achieve this collaborative decision-making (CDM) environment, Business Intelligence (BI) software is beginning to merge with Web 2.0 technologies, harnessing the rich, open-access, easy-to-use functionality that users have come to expect. The merging of BI and Web 2.0 technologies has given rise to the new concept of Social and Collaborative BI – a type of CDM platform. This platform, like social Web 2.0 technologies, is designed around the premise that anyone should be able to share content and contribute to discussion, anywhere and anytime.

IDC predicts that 2011 will be the year in which the trend of embedding social media-style features into BI solutions makes its mark, and that virtually all types of business applications will undergo a fundamental transformation.

IDC, along with many other analytics firms, also believes the emerging CDM software market will grow quickly, forecasting revenues of nearly $2 billion by 2014, with a compound annual growth rate of 38.2 percent between 2009 and 2014.
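To put those two figures together: a 38.2 per cent compound annual growth rate over the five years from 2009 to 2014 ending near $2 billion implies a 2009 base of roughly $400 million. The short Python sketch below checks that arithmetic; the exact year framing is my assumption about how IDC defines the period.

    # Back-of-the-envelope check of the IDC forecast (assumed 2009-2014 framing).
    cagr = 0.382
    years = 2014 - 2009                  # five compounding periods
    revenue_2014_billion = 2.0           # forecast revenue, in billions of dollars

    implied_2009_base = revenue_2014_billion / (1 + cagr) ** years
    print(f"Implied 2009 CDM market size: ~${implied_2009_base:.2f}B")   # ~$0.40B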


Leading-edge companies are already adopting this approach and gaining early insight into a real driver of competitive advantage.


Paul Ormonde-James


In-memory computing is making its way out of R&D labs and into the enterprise, enabling real-time processing and intelligence

The massive explosion in data volumes collected by many organisations has brought with it an accompanying headache: putting that data to gainful use.

Businesses increasingly need to make quick decisions, and pressure is mounting on IT departments to provide solutions that deliver quality data much faster than has been possible before. The days of trapping information in a data warehouse for retrospective analysis are fading in favour of event-driven systems that can provide data and enable decisions in real time.

Indeed, real-time computing is a new catch cry across the technology industry. A hypothetical example is a retailer that can monitor a customer’s real-time behaviour in store or on a website and draw on historical data from their loyalty system regarding spending patterns to make them offers that they might respond to in that moment.

Such a scenario has long been dreamed of, but it is being made possible today for retailers and other industries thanks in part to a technology known as in-memory computing.
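A minimal Python sketch of that retailer scenario follows; the event fields, loyalty-history structure and offer rule are all invented for illustration, and a production system would hold the history in an in-memory store rather than a Python dict.

    # Hypothetical sketch: react to a live in-store or website event by combining
    # it with the customer's historical spending pattern from the loyalty system.
    loyalty_history = {
        "cust-1001": {"favourite_category": "coffee", "avg_basket": 42.50},
    }

    def offer_for(event):
        """Pick an offer for a customer the moment an event arrives."""
        profile = loyalty_history.get(event["customer_id"])
        if profile is None:
            return None                              # no history, no personalised offer
        if event["category"] == profile["favourite_category"]:
            return "10% off " + profile["favourite_category"] + " today"
        return None

    print(offer_for({"customer_id": "cust-1001", "category": "coffee"}))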

In-memory computing works by bringing data physically closer to the central processing unit.

Chip manufacturers have been on this path for some time with the integration of Level 1 and Level 2 caching into microprocessors, as moving indices into Level 1 cache makes them more quickly accessible. Moving out through the caching levels usually results in a loss of speed but an increase in the size of data storage.

In-memory technology follows the same principles, moving data off disk and into main memory. This eliminates the need to run a disk-seek operation each time a data look-up is performed, significantly boosting performance.
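As a rough illustration of that difference, the sketch below pays the disk cost once by loading a lookup table into memory, after which every look-up is served from RAM rather than from a disk seek. The file name and record layout are made up for the example.

    import csv

    # Hypothetical illustration: read the disk file once, then answer every
    # subsequent look-up from an in-memory index instead of seeking on disk.
    def load_index(path="customers.csv"):
        index = {}
        with open(path, newline="") as f:        # single pass over the disk file
            for row in csv.DictReader(f):
                index[row["customer_id"]] = row  # keep the full record in RAM
        return index

    index = load_index()                         # one-off load from disk
    record = index.get("cust-1001")              # this and all later look-ups hit memory only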

The idea of running databases in memory is nothing new, and was one of the foundations of the business intelligence product QlikView, released by QlikTech way back in 1997. More recently other technology companies have jumped on the bandwagon, notably SAP and TIBCO. What is making in-memory so popular now is that plunging memory prices have made it economical for a wider range of applications.

Gartner’s managing vice-president for business intelligence and data management, Ian Bertram, says the key application for in-memory technology today remains business intelligence, where it enables analysis to be conducted on the fly, with accompanying faster data refreshes.

“It’s creating a data warehouse in-memory, which is a much faster technology than doing stuff on disk,” Bertram says. “Disk clearly has a much greater capacity, but it is slower. In memory, it is instantaneous. The value comes in for people who have to make decisions really quickly, as they don’t have to wait 12 hours for an OLAP [online analytical processing] cube to be built anymore.”
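The practical difference Bertram describes is that aggregates can be computed on demand over data already held in memory, rather than waiting for a cube build. The toy sketch below (with invented column names and values) shows an ad-hoc roll-up of that kind.

    from collections import defaultdict

    # Toy in-memory "fact table"; a real system would hold millions of rows in RAM.
    sales = [
        {"region": "NSW", "product": "widgets", "revenue": 120.0},
        {"region": "VIC", "product": "widgets", "revenue": 95.0},
        {"region": "NSW", "product": "gadgets", "revenue": 40.0},
    ]

    def rollup(rows, dimension):
        """Aggregate revenue by any dimension on the fly, with no pre-built cube."""
        totals = defaultdict(float)
        for row in rows:
            totals[row[dimension]] += row["revenue"]
        return dict(totals)

    print(rollup(sales, "region"))   # {'NSW': 160.0, 'VIC': 95.0}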

As a pioneer of in-memory technology, QlikView is used by a range of Australian companies for business intelligence, including the packaging company Amcor, which is using it to make better decisions on the profitability of its delivery operations.

QlikTech’s senior vice president of products, Anthony Deighton, claims the in-memory architecture has enabled his company to build a product that focuses on ease of use.

“When users interact with our product, they can click and find their own path through the data, and do that interactively and at high speed,” he says.

The marketplace for in-memory products is rapidly becoming crowded. SAP co-founder, Hasso Plattner, has been driving the development of in-memory computing through the eponymous Institute that he founded at the University of Potsdam in 1998. Its efforts were first unveiled at SAP’s annual SAPPHIRE conference in 2009. This year, SAP announced the High-performance Analytical Appliance (HANA) project as a roadmap for in-memory computing.

About 20 companies were invited to join the trial, but more than 40 have now done so, and SAP will be selling in-memory appliances by the end of 2010.

SAP Australia and New Zealand products and solutions group director, John Goldrick, says that in-memory data look-up times are further reduced through the data being stored in a column format (as is the case with SAP’s Sybase IQ database) rather than rows, which means that for many operations only the contents of one column need to be read, not the entire table.

As each column comprises values of the same data type and size, the databases can be efficiently compressed. According to Goldrick, about 40 per cent of data storage is no longer required, along with about 80 per cent of data look-up activity. In one instance, he says, a 1.8TB database was compressed down to just 70GB.
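A simple way to see why same-typed columns compress so well is dictionary encoding: repeated values in a column collapse to a small dictionary plus compact integer codes. The sketch below is illustrative only; the compression figures quoted above come from SAP, not from this code.

    # Illustrative dictionary encoding of a single column. Because every value in
    # a column shares one type and values repeat heavily, the column shrinks to a
    # small dictionary of distinct values plus an array of integer codes.
    def dictionary_encode(column):
        dictionary, codes, positions = [], [], {}
        for value in column:
            if value not in positions:
                positions[value] = len(dictionary)
                dictionary.append(value)
            codes.append(positions[value])
        return dictionary, codes

    country_column = ["AU", "AU", "NZ", "AU", "NZ", "AU"]
    dictionary, codes = dictionary_encode(country_column)
    print(dictionary)   # ['AU', 'NZ']
    print(codes)        # [0, 0, 1, 0, 1, 0]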

“All of a sudden you are moving down to being able to use raw data faster than you could ever use the aggregated data, so the whole processing time becomes faster,” Goldrick says. “We did a study and worked out that we could hold the entire system of all but four of our largest customers on one blade, and get the processing speed to be 10,000 times faster than it currently is from disk.”

Most in-memory BI systems today draw data from an existing source such as a data warehouse. Reporting tools can then be pointed at the in-memory database to generate reports.

 

Transactional reporting

 

The next step is to move transactional systems to in-memory, where it is possible to update and analyse transaction information in real time. In June, TIBCO released a series of new in-memory products for high-speed data management, the ActiveSpaces Suite, to provide fast shared memory for distributed applications to more quickly exchange and process real-time data.

Chairman and chief executive, Vivek Ranadivé, says it has immediate application in fields where there are high numbers of transactions, such as airlines, banking or utility smart grids.

“You can analyse things as they happen in your transaction system in real time,” Ranadivé says. “There is no extracting and translating the data into your data warehouse to run your analytics.”

His vision is to enable corporations to evolve from being transaction-driven businesses to become event-driven ones, where events within the organisation can trigger actions in real time based on existing data; a process he describes as the ‘two-second advantage’.

“The two-second advantage is about having a little bit of the right information in the right context just a little beforehand — whether it is two seconds, two hours or even two days,” says Ranadivé. “By sensing what is happening around them, businesses can constantly adjust and react a little ahead of the competition or in anticipation of a customer’s wants and needs.”

Underlying this is in-memory architecture, which he says offers fast, shared memory to accelerate the exchange and processing of real-time data and events. Hence, in-memory technology is generally spoken of in the same breath as the broader movement towards real-time information processing systems.
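A stripped-down sketch of the event-driven pattern Ranadivé describes appears below: each transaction is analysed as it arrives against reference data already held in memory, and an action fires immediately rather than after an extract into a warehouse. The event shape, threshold rule and handler are all invented for illustration; TIBCO's ActiveSpaces API is not shown here.

    # Hypothetical event-driven sketch: analyse each transaction the moment it
    # arrives, against reference data already in memory, and act straight away.
    typical_spend = {"cust-1001": 40.0}             # reference data held in memory

    def alert(message):
        print("ALERT:", message)                    # stand-in for a real-time action

    def on_transaction(event):
        baseline = typical_spend.get(event["customer_id"], 0.0)
        if event["amount"] > 5 * baseline:          # invented trigger rule
            alert("Unusual spend by " + event["customer_id"])

    for event in [{"customer_id": "cust-1001", "amount": 250.0}]:
        on_transaction(event)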

 


According to a Forrester Consulting study commissioned by Geographic Business Intelligence® software developer Alteryx®, LLC, industries such as retail, real estate, banking and telecommunications are moving quickly toward single-vendor BI solutions and pre-packaged content, converging business processes toward a new generation of best-practice decision-making.

Orange, CA (PRWEB) January 19, 2011

Recent advancements in software capabilities, technologies and deployments are shifting Business Intelligence (BI) paradigms, and are changing the way companies do business.


The Forrester study from October 2010, Integrated And Flexible Business Intelligence Solutions Are Vital For Better Business Decisions, identifies and outlines the state-of-the-art in BI best practices, and how convergence of end-to-end solutions and business processes are boosting enterprise performance and ROI. Flexible use scenarios are cited in various examples of how next-generation deployments are driving decisions in the cloud for convenience, and on-premise for security priorities. For a copy of the report, click on this link: http://alteryx.com/How-Alteryx-Performs/Pages/White-Papers/Forrester-Report-on-BI-Platform-Requirements.aspx

“Anywhere where it’s cost-effective and reveals proprietary data, there are benefits to a flexible deployment option such as on-premise, hosted or SaaS,” said an IT director at a leading apparel company who was interviewed during the study. “[The] question is, does anyone else have access to it?”

The findings support Forrester Research, Inc. analyst Boris Evelson’s December 23, 2010 report, BI Maturity In The Enterprise: 2010 Update, which identifies strong trends in agile BI with less centralization and more pervasive use across wide-ranging vertical markets and industry categories. Talent and skills are in demand, as more enterprises combine BI and spatial analytics with their traditional business processes. Evelson will share new research and address the state of the art in BI strategies when he keynotes the Alteryx Business Leader Summit on March 9th at the Omni Interlocken Resort in Broomfield, Colo.

“Although BI is one of the original software application disciplines, spatial analytics, pre-packaged content and single-vendor solutions are helping to shift paradigms across multiple vertical markets in the BI software space,” said Alteryx CEO Dean Stoecker.

“More than a map in the cloud, spatial requires integrated, end-to-end solutions and an architectural platform that can cross the chasms between the desktop and the cloud,” said Stoecker. “The code base of the future is already transcending geospatial processing and moving toward solving multi-disciplined business problems, and changing the very nature of BI and its inter-relationship with the science of business process.”

The company’s latest offering, Alteryx 2011, is a seamlessly integrated BI platform delivering robust consumer profiling and behavior segmentation tools, advanced processing of spatial and non-spatial data, and a full range of BI functionality, including Spatial Processing; Extract, Transform and Load (ETL); Address Hygiene and Data Quality; BI Analytics; Reporting and Visualization; and Customer Data Integration (CDI).

Alteryx 2011 builds on the company’s proven technologies and now incorporates several enhanced BI features previously only available as standalone offerings:

  • Location Optimization for planning business network expansions and/or contractions
  • Behavioral Analytics for consumer profiling and segmentation
  • Automated Module Packaging for simple, enterprise-wide distribution
  • New Tool Suites with enhanced analytics and expanded charting options
  • New Macros that add iterative processing capabilities and additional optimization options

About Alteryx

Solving business problems for nearly a quarter-million users worldwide, Alteryx is driving the global revolution in Geographic Business Intelligence®. Through smart, extensible solutions from the desktop to the web, the Alteryx software platform delivers the fastest, most comprehensive consumer, business and market insights to the Fortune 500, mid-market companies, government and academia. Alteryx hosted, on-premise and SaaS solutions integrate spatial intelligence into enterprise workflows, seamlessly scaled across local, regional and global markets. Inspiring ingenuity since 1997, Alteryx, LLC is headquartered in Orange, Calif., with its Technology Center in Boulder, Colo. For more information please visit http://www.alteryx.com.

Alteryx and Geographic Business Intelligence are registered trademarks of Alteryx, LLC

Read more: http://www.benzinga.com/press-releases/11/01/p790898/business-intelligence-processes-converge-to-drive-next-generation-busin#ixzz1BZAMGw7x


Good article on the changing appliance market.

 

……………………………………………………………………………………………………….

One year after announcing a $250 million, three-year pact to deliver next-generation data center technology, Hewlett-Packard Co. and Microsoft today unveiled five appliances that offer Exchange and SQL Server in turnkey configurations.

The two companies’ offerings range from a so-called private cloud in a box designed to virtualize databases to a $2 million data warehouse appliance. The latter marks the official release of Microsoft’s largest and most scalable version of its database to date, SQL Server 2008 R2 Parallel Data Warehouse edition, code-named Project Madison.

 

“We’re paying off on the announcement we made a year ago,” said Doug Small, HP’s director of infrastructure to applications business.

The new HP E5000 Messaging System for Microsoft Exchange Server 2010 marks the first time Redmond’s core e-mail platform has been available in a turnkey configuration. The companies say the system can be deployed in a matter of hours. Small emphasized the system’s design around high availability. “The solutions have high availability directly built into it to ensure secure communication across the whole messaging system,” he said.

A system designed for mid-sized businesses will support 500 mailboxes, each at 1 GB, with an enterprise version designed to accommodate 3,000 mailboxes at 2 GB apiece. Organizations can cascade multiple appliances to scale them out, Small said. The Exchange appliances, due to ship in March, will start at $35,000 each for the hardware alone; software licensing is separate.
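Using those quoted sizing figures (500 mailboxes at 1 GB for the mid-sized model, 3,000 at 2 GB for the enterprise model, with appliances cascaded to scale out), a quick capacity calculation looks like the sketch below; the 10,000-mailbox organisation is an invented example.

    import math

    # Per-appliance capacity as quoted for the HP E5000; the organisation size
    # is a made-up example for the arithmetic.
    models = {
        "mid-sized":  {"mailboxes": 500,  "quota_gb": 1},
        "enterprise": {"mailboxes": 3000, "quota_gb": 2},
    }

    org_mailboxes = 10_000
    model = models["enterprise"]
    appliances_needed = math.ceil(org_mailboxes / model["mailboxes"])   # cascade to scale out
    total_storage_gb = org_mailboxes * model["quota_gb"]

    print(appliances_needed, "enterprise appliances")   # 4
    print(total_storage_gb, "GB of mailbox storage")    # 20000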

The other appliances are centered around SQL Server: they include the HP Business Decision Appliance, the Business Data Warehouse Appliance and the HP Database Consolidation Appliance.

The Business Decision Appliance is aimed at rapid provision of BI implementations, said Fausto Ibarra, Microsoft’s senior director of BI. “It’s designed as a self-service BI solution, with everything you need, software and hardware, to empower end users to analyze large amounts of data from any number of data sources,” he said.

Microsoft said the appliance is optimized for running both SQL Server and SharePoint and can be installed and configured in less than one hour. It also runs the PowerPivot data analysis tool for Excel and provides a single dashboard for auditing, monitoring and managing BI solutions running on the appliance. Pricing starts at $28,000 for the hardware only. It is available now.

At the high end of the spectrum, HP today began shipping the Enterprise Data Warehouse Appliance, announced back in November, which the companies say is 10 times more scalable than previous SQL Server deployments and capable of delivering queries 200 times faster.

For those looking to build smaller data warehouses based on SQL Server, HP announced the Business Data Warehouse Appliance, a scaled-down version aimed at small and medium businesses, also built on SQL Server 2008 R2 Parallel Data Warehouse edition.

“The Enterprise Data Warehouse Appliance is targeted at the high-end data warehousing scenarios, but there’s also many cases where companies are looking for an entry-level data warehouse or a data mart that complements their enterprise data warehouse,” Ibarra said. “The HP Business Data Warehouse Appliance is aimed at smaller data warehousing deployments. Also it’s very complementary to the Enterprise Data Warehouse. It works together but it’s independent.” The smaller Business Data Warehouse Appliance will be available in June, Microsoft said. Pricing was not disclosed.

The last of the five is the HP Database Consolidation Appliance, aimed at consolidating hundreds or even thousands of databases into a private cloud environment. The system is optimized for SQL Server 2008 R2 and Hyper-V Cloud, which is designed for rapid deployment, Microsoft said.

“Nowadays most companies are looking to build private clouds to get the benefits of public clouds, get them into their own data center,” Ibarra said. “Essentially [that means] having a private cloud solution which can enable them to consolidate transaction applications, sharing pools of resources and making it very easy to allocate capacity.” The HP Database Consolidation Appliance will be available in the second half of this year. The company did not disclose pricing.

While the new appliances will be available direct and through any HP or Microsoft channel partner, both companies are emphasizing that customers would be best served by acquiring them from partners engaged in the companies’ joint Frontline Channel Partner program.

“We’re not constraining the product availability, we have a specific program that we’ve developed and funded as part of this initiative that has the targeted training and other resources available,” Small said. “So it will be in a partner’s best interest to sign up for either the SQL or Exchange subset of the Frontline partner program to get access to the program.”

 


Data warehouse technology from innovative leader pays big, tangible dividends for customers

DAYTON, Ohio, Jan. 14, 2011 /PRNewswire/ — Teradata Corporation (NYSE: TDC) today announced that it has been recognized for its technology dominance in the data warehouse landscape by a new study from the Information Difference research firm. The research report further validates Teradata’s leadership as the world’s largest company solely focused on data warehousing and enterprise analytics.

(Logo:  http://photos.prnewswire.com/prnh/20090909/TERADATALOGO )

“As a pioneer in the data warehousing market, Teradata has proven its ability to deliver a foundation for a wide range of business analytics, which supports both departmental and enterprise intelligence for companies of all sizes.  Teradata’s reference customers were some of the most satisfied in our survey, and we ranked Teradata highest for technology of all vendors,” said Andy Hayler, chief executive officer and co-founder of The Information Difference. “This performance and capability pays big, tangible dividends for Teradata customers by helping them to out-maneuver their competitors.”

“It is gratifying that Teradata was recognized, and this validates our strategy of helping customers achieve success through the use of business analytics. To support our customers, Teradata offers a complete platform family designed to perform in today’s toughest environments and address all manner of analytic requirements,” said Scott Gnau, head of development, Teradata Corporation.  “Leveraging our unique and unmatched architecture, the Teradata platform family not only offers market-leading capabilities, but delivers the earliest adoption of breakthrough technologies like flash storage, multi-core and in-memory processing, all in a seamless environment.”

The technology dimension position is derived from a weighted set of scores based on four factors: customer satisfaction as measured by a survey of reference customers, analyst impressions of the technology, maturity of the technology, and breadth of technology in terms of coverage against the Information Difference functionality model.
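As a purely hypothetical illustration of how such a weighted composite across those four factors could be combined, the sketch below uses made-up weights and scores; none of these numbers come from the Information Difference study.

    # Hypothetical weights and vendor scores illustrating a weighted composite
    # across the four factors named above; the actual study weights and scores
    # are not reproduced here.
    weights = {
        "customer_satisfaction": 0.40,
        "analyst_impressions":   0.25,
        "maturity":              0.20,
        "breadth_of_coverage":   0.15,
    }
    vendor_scores = {
        "customer_satisfaction": 8.5,
        "analyst_impressions":   8.0,
        "maturity":              9.0,
        "breadth_of_coverage":   7.5,
    }

    composite = sum(weights[k] * vendor_scores[k] for k in weights)
    print(round(composite, 2))   # about 8.3 on this made-up scale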