LAS VEGAS -- SAS Institute this week unveiled tools that it says make it easier for its enterprise customers to use the company's business analytics software to analyze data stored in Hadoop environments.
At its Premier Business Leadership Series here, SAS unveiled an upgraded version of its High Performance Analytic Server that adds support for Hadoop and the Hadoop Distributed File System (HDFS).
The update offers corporate users a way to integrate data stored in Hadoop with data from other sources, and then analyze the combined data set.
The goal is to give companies a way to quickly extract useful business insights from massive amounts of structured and unstructured data, said Jim Davis, chief marketing officer at SAS.
The updated High Performance Analytic Server will let users directly load data from Hadoop data stores and run analytics on it at very high speeds, Davis said.
The server takes advantage of a data access technology called SAS/Access Interface for Hadoop, which SAS says lets enterprises access, read, write and update data stored in Hadoop from within their SAS environments.
All Hadoop data loaded into the SAS server using the Access Interface will appear native to the server, so there is no need to use SQL or any other database-specific language to query the data, Davis said. As a result, companies will be able to integrate Hadoop data with their other data and quickly analyze it, he claimed.
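The idea of treating Hadoop-resident data as native can be illustrated with a conceptual sketch (this is plain Python, not SAS code, and the record shapes and field names are illustrative assumptions): once clickstream records loaded from Hadoop sit in memory alongside data from another source, they can be joined and summarized with ordinary in-memory operations, with no SQL or HDFS-specific query language involved.

```python
def join_and_summarize(hadoop_rows, crm_rows):
    """Join clickstream rows (loaded from Hadoop) with CRM rows on
    customer_id, then total the clicks per customer segment."""
    # Map each customer to a segment; unmatched customers fall back
    # to "unknown".
    segments = {row["customer_id"]: row["segment"] for row in crm_rows}
    totals = {}
    for row in hadoop_rows:
        seg = segments.get(row["customer_id"], "unknown")
        totals[seg] = totals.get(seg, 0) + row["clicks"]
    return totals

# Hypothetical sample data for illustration only.
hadoop_rows = [
    {"customer_id": 1, "clicks": 12},
    {"customer_id": 2, "clicks": 7},
    {"customer_id": 3, "clicks": 4},
]
crm_rows = [
    {"customer_id": 1, "segment": "gold"},
    {"customer_id": 2, "segment": "silver"},
]

print(join_and_summarize(hadoop_rows, crm_rows))
# {'gold': 12, 'silver': 7, 'unknown': 4}
```

The point of the sketch is only that the analysis step sees ordinary records, regardless of whether they originated in HDFS or elsewhere.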
"The value of High-Performance Analytics is not just in being able to analyze all of your data. It's in being able to do things differently than you've ever done them before," added Tonya Balan, director of analytics product management at SAS.
The SAS offerings are a "strong response" to the growing flood of data at many companies, said Larry Seligman, director of BI and analytics at the Intercontinental Hotels Group.
"Every organization that uses high volume data like clickstream, sensor, or location data must confront the question of how to visualize it, segment it, mine drivers from it, and forecast against it," he said via email.
"The analytics technologies that will get us to 2015 will be radically different from what we are using today," he added.
The SAS server was first released last year and is designed to help companies analyze terabytes of structured and unstructured data in near real-time.
The product currently runs on database appliances from Teradata and EMC's Greenplum unit, and uses technologies such as in-memory analytics and in-database analytics to speed up analysis.
SAS this week also released an enhanced version of its SAS Text Miner technology for analyzing unstructured text data from sources such as blogs, newsfeeds and call centers.
The update adds a new Text Rule Builder designed to let companies more easily classify big data content in real time, according to SAS. It will allow companies to apply linguistic rules and statistical methods to classify and categorize massive volumes of unstructured data far more easily than was previously possible.
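The general approach of combining linguistic rules with statistical methods can be sketched in a few lines of Python (a toy illustration, not the SAS Text Rule Builder; the categories, keywords and fallback heuristic are all assumptions made for the example): hand-written rules fire first, and a simple statistical fallback handles text no rule matches.

```python
# Hypothetical rule set: each rule maps a category to trigger keywords.
RULES = [
    ("complaint", ["refund", "broken", "terrible"]),
    ("praise",    ["great", "love", "excellent"]),
]

def classify(text):
    # Normalize: lowercase and strip common punctuation from tokens.
    words = [w.strip(".,!?") for w in text.lower().split()]
    # Pass 1: linguistic rules -- first rule with a matching keyword wins.
    for label, keywords in RULES:
        if any(k in words for k in keywords):
            return label
    # Pass 2: crude statistical fallback -- in this toy model, long
    # unmatched texts are treated as complaints (an assumption for the
    # example, not a SAS method).
    return "complaint" if len(words) > 20 else "neutral"

print(classify("I want a refund, this is broken"))   # complaint
print(classify("Great service, love the new app"))   # praise
print(classify("Order arrived Tuesday"))             # neutral
```

A production classifier would use trained models rather than a length heuristic, but the two-pass structure — explicit rules first, statistics as backstop — is the pattern the passage describes.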
SAS also released the SAS DataFlux Event Stream Processing Engine, a complex event processing technology designed to analyze streaming data as received by a company. SAS said the tool will help companies examine and analyze clickstream data, market data feeds and other data streaming into their systems for applications like online ad optimization and fraud detection.
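A core pattern in complex event processing of the kind described here is evaluating each event against a sliding time window as it arrives. The sketch below (plain Python, not the SAS DataFlux engine; the threshold, window size and event fields are illustrative assumptions) flags a payment card as suspicious the moment it exceeds a set number of transactions within a sliding window — a simplified version of streaming fraud detection.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # sliding window length (assumed for the example)
MAX_EVENTS = 3        # transactions allowed per card per window

def detect_bursts(events):
    """events: iterable of (timestamp_seconds, card_id) in time order.
    Yields a card id the moment it exceeds MAX_EVENTS within
    WINDOW_SECONDS -- i.e., detection happens as the stream arrives,
    not in a later batch job."""
    recent = defaultdict(deque)   # card_id -> timestamps inside window
    for ts, card in events:
        window = recent[card]
        window.append(ts)
        # Evict timestamps that have slid out of the window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) > MAX_EVENTS:
            yield card

# Hypothetical stream: card "A" bursts early, then goes quiet.
stream = [(0, "A"), (10, "A"), (20, "B"), (25, "A"), (30, "A"), (400, "A")]
print(list(detect_bursts(stream)))  # ['A'] -- 4 events for A within 60s
```

The same windowed-evaluation structure applies to the clickstream and market-data-feed use cases SAS mentions; only the event schema and the rule change.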
Jaikumar Vijayan covers data security and privacy issues, financial services security and e-voting for Computerworld. Follow Jaikumar on Twitter at @jaivijayan, or subscribe to Jaikumar's RSS feed. His e-mail address is [email protected].