Which software do you like best for business intelligence, fraud detection, web analytics, data mining, or risk management?
What do you think of SAS Enterprise Miner? Which procedures do you use most? Do you use it mostly to process large datasets (more than 10MM rows, more than 100 variables)? If you use SAS/STAT, which procedures do you find most useful? (Please give context, e.g. pharmaceutical work, small samples, etc.)
What about open-source software such as R, or commercial packages like S-PLUS, Salford Systems, JMP, etc.? My experience is that they load your entire dataset into RAM, and are thus limited when dealing with large datasets (I could not get JMP decision trees to process more than 300,000 rows of summarized data). Did you manage to solve such problems with Syncsort?
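The RAM limitation above is usually worked around by streaming: aggregate or summarize the data in chunks so only one row (or chunk) is resident at a time, then feed the much smaller summary to the in-memory tool. A minimal sketch of that idea in Python, using only the standard library (the field names and file here are hypothetical stand-ins, not from the original post):

```python
import csv
import io
from collections import Counter

def aggregate_by_key(lines, key_field):
    """Stream rows one at a time and count occurrences per key,
    so memory use stays roughly constant regardless of row count."""
    reader = csv.DictReader(lines)
    counts = Counter()
    for row in reader:
        counts[row[key_field]] += 1
    return counts

# Tiny in-memory stand-in for a large file; in practice you would
# pass an open file handle so rows are never all resident in RAM.
data = io.StringIO("region,amount\neast,10\nwest,5\neast,7\n")
print(aggregate_by_key(data, "region"))  # Counter({'east': 2, 'west': 1})
```

The same streaming idea is what dedicated sort/merge utilities like Syncsort exploit: they work on data larger than memory by processing it sequentially rather than holding it all at once.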
When looking for a vendor, comparative studies or benchmarking, what is your favorite reference?
Tags: R, SAS, datasets, large, open, software, source, statistical