Over the last decade, GPU computing has evolved from a fringe technology to a core element that powers some of the world’s top supercomputers.
Risk Management with Parallelized Algorithms on GPUs
Partha Sen of Fuzzy Logix will deliver some fodder for the big data and financial services folks who tend to turn out in droves for GPU conferences (even if they won’t talk to us about the specifics of what they’re working on…ahem). This is probably one of the most advanced sessions a BI/data mining attendee could select, but it sounds fascinating.

As Sen describes it: “The challenge with intra-day risk management is that a very large number of calculations must be performed in a very short amount of time. Typically, we may be interested in calculating VaR for 100 to 1,000 securities per second based on 100 million potential scenarios. The magnitude of these calculations is not utopian; it reflects the reality of modern financial institutions and exchanges. In this presentation, we outline how the complex problem of intra-day risk management can be solved using parallelized algorithms on GPUs. The methodology has been proven in a POC at two financial institutions.”
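To give a sense of the computation Sen is describing, here is a minimal, hypothetical sketch of a Monte Carlo VaR calculation for a single security. This is not Fuzzy Logix’s implementation; it is a plain-Python stand-in (the function name, parameters, and the assumed normal return distribution are all illustrative) for work that, in production, would be fanned out across GPU threads, one per security/scenario pair.

```python
# Hypothetical Monte Carlo VaR sketch -- NOT the Fuzzy Logix method.
# Simulates many return scenarios for one security and reads off the
# loss at the chosen confidence quantile.
import random


def monte_carlo_var(mu, sigma, scenarios=100_000, confidence=0.99, seed=42):
    """Estimate one-period Value-at-Risk for a single security.

    mu, sigma: assumed mean and volatility of the (normal) return
    distribution -- an illustrative modeling assumption.
    Returns VaR as a positive loss fraction of position value.
    """
    rng = random.Random(seed)
    # Each simulated return is independent, which is what makes the
    # problem embarrassingly parallel on a GPU.
    returns = [rng.gauss(mu, sigma) for _ in range(scenarios)]
    returns.sort()
    # VaR at the given confidence is the loss at the (1 - confidence)
    # quantile of the simulated return distribution.
    cutoff = int((1.0 - confidence) * scenarios)
    return -returns[cutoff]


var_99 = monte_carlo_var(mu=0.0005, sigma=0.02)
print(f"99% one-day VaR: {var_99:.2%} of position value")
```

Scaling this toy loop to Sen’s figures, hundreds of securities per second against 100 million scenarios each, is exactly the kind of throughput problem where GPUs earn their keep: every (security, scenario) pair is an independent draw that can run on its own thread.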