I had the chance to drop in on a SAS event this week and get a view of their latest technology around data visualization.
Most of the market has been aware for some time of SAS’s ability to aggregate massive amounts of data from a large number of sources and, via its in-memory processing architecture, perform complex analysis of that data. Even though the technology has been around for a while, SAS continues to improve its processing capabilities, and it’s very impressive to see the results live. As part of their demonstration, they ran a 9×9 variable analysis against a data set of more than 1.2 billion rows and returned the results in less than 4 seconds.
Granted, SAS’s stuff doesn’t come cheap, and it’s not just a matter of sticking a box in your shop and having it up and running. However, if your company faces massive data sets (Big Data or even just large data), significant financial exposures, and a need for near real-time analysis, the capabilities of SAS’s technology are pretty amazing and certainly worth looking into if you haven’t been aware of it.
Again, their analytics technology has been out for a while, is well adopted in the energy and retail markets, and is somewhat “old news”. What I actually came to see was their newer SAS Visual Analytics product, particularly as it has been applied to CTRM. The product overlays their analytics engine, provides a very consumable way of looking at complex data and analyses, and supports a number of graphing techniques and formats that go well beyond those commonly found in CTRM systems. I was very impressed by the ability to run and visualize, in essentially real time, position reports, scenario analyses, PFE, VaR, and just about every other complex analysis you can imagine; and then to drill down, again in real time, to visually identify the key data elements that might be out of range or create the greatest exposures. Very impressive stuff.