How we handle Morningstar data at PlanTools
At PlanTools we have a simple workflow for handling data feeds from Morningstar®. That simplicity makes it easy and efficient for our developers and QA team to use and validate the data that populates our platform of retail and custom solutions. The time saved also lets us stay productive across multiple projects instead of being bogged down on a single one.
Here is a little background. We currently import over 46 GB of data every single month: 50+ data feeds covering 213,000+ funds, with 12,000+ data points per fund. That's a lot of +'s… =). Morningstar delivers this data over SFTP in a compressed format. Once the feeds are on our servers, we uncompress the files and import them into our SQL database, which takes around 45 minutes. We then import the uncompressed data into our PlanTools application, a cumbersome process that takes about six hours because of the size of the files. Of course, importing 46 GB of data into any system takes a while, so sit back, order lunch, and watch a movie while you wait. This had been our modus operandi until now…
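For the curious, the monthly pipeline boils down to two steps once the feeds land on our servers: uncompress, then load. Here is a rough sketch in Python; the directory names, .gz format, and SQLite staging table are illustrative assumptions for the sketch, not our production setup:

```python
import gzip
import shutil
import sqlite3
from pathlib import Path

# Illustrative paths -- the real feed locations differ.
FEED_DIR = Path("feeds/incoming")        # where the SFTP client drops files
STAGING_DIR = Path("feeds/uncompressed")

def decompress_feeds():
    """Uncompress every .gz feed into the staging directory."""
    STAGING_DIR.mkdir(parents=True, exist_ok=True)
    for archive in FEED_DIR.glob("*.gz"):
        target = STAGING_DIR / archive.stem  # "feed.txt.gz" -> "feed.txt"
        with gzip.open(archive, "rb") as src, open(target, "wb") as dst:
            shutil.copyfileobj(src, dst)

def import_feed(db_path, feed_file):
    """Load one uncompressed feed into a staging table.

    SQLite stands in here for whatever SQL database you use; each raw
    line becomes one row, to be parsed by downstream import jobs.
    """
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS raw_feed (line TEXT)")
    with open(feed_file, "r", encoding="utf-8") as fh:
        conn.executemany(
            "INSERT INTO raw_feed (line) VALUES (?)",
            ((line.rstrip("\n"),) for line in fh),
        )
    conn.commit()
    conn.close()
```

The bottleneck in a pipeline like this is rarely the decompression; it is pushing tens of gigabytes through row-by-row inserts, which is exactly the pain described above.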
This timeline was unacceptable to our developers, who took up the challenge to think outside the box and simplify the process. They met it by first creating a program that splits any file into smaller chunks. This let us easily open multiple files in an XML editor and pull the raw data for any investment into a single report, so we could view and validate data efficiently.
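The chunking idea itself is straightforward: stream the XML rather than loading it all, and write every N records to its own file. A minimal sketch, assuming the feed wraps repeated `<Fund>` records (the tag name and chunk size are assumptions, not the Morningstar schema):

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def split_xml(source, out_dir, record_tag="Fund", chunk_size=1000):
    """Stream a large XML feed and write groups of records to smaller files.

    Returns the number of chunk files written. iterparse keeps memory flat
    even on multi-gigabyte inputs, because each record is cleared once copied.
    """
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    batch, chunk_no = [], 0
    for _event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == record_tag:
            batch.append(ET.tostring(elem, encoding="unicode"))
            elem.clear()  # free the record we just serialized
            if len(batch) >= chunk_size:
                _write_chunk(out_dir, chunk_no, batch)
                chunk_no += 1
                batch = []
    if batch:  # flush the final partial chunk
        _write_chunk(out_dir, chunk_no, batch)
        chunk_no += 1
    return chunk_no

def _write_chunk(out_dir, n, records):
    """Wrap a batch of records in a root element so each chunk is valid XML."""
    path = Path(out_dir) / f"chunk_{n:04d}.xml"
    path.write_text("<Funds>\n" + "".join(records) + "</Funds>\n", encoding="utf-8")
```

Because each chunk is itself well-formed XML, any editor that chokes on the original file opens the pieces without complaint.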
While this was a vast improvement, we were not satisfied with the results; we knew we could do better. So we went back to the drawing board and devised a plan to pull down all data for a single investment within seconds, in a clean, legible format, in any editor. Currently, Morningstar is the only company with a tool like this, and it is for internal use only. Subscribers to Morningstar's data can request such a report directly from Morningstar, but the request takes days to fulfill and arrives in multiple pieces, one per data feed, from each department responsible for maintaining that data. Even then, you are left with the challenge of compiling the pieces into a single report. The only viable solution was to develop a customized internal tool for PlanTools' use, which I am happy to report we accomplished. Now…
What takes Morningstar days takes us seconds, and with this efficiency we can optimize our product development, QA testing, and discovery processes.
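Conceptually, a lookup like this comes down to a prebuilt index mapping each fund id to the locations of its records across every feed, so a report is just a handful of seeks instead of a full scan. Here is a minimal sketch; the JSON-lines layout and the `fund_id` field are assumptions purely for illustration, and our actual tool and the Morningstar file formats differ:

```python
import json
from collections import defaultdict
from pathlib import Path

def build_index(feed_dir):
    """Scan every feed once and record where each fund's data lives.

    Maps fund id -> list of (file name, byte offset) pairs. Built once
    per monthly import; reused for every lookup afterward.
    """
    index = defaultdict(list)
    for feed in sorted(Path(feed_dir).glob("*.jsonl")):
        with open(feed, encoding="utf-8") as fh:
            while True:
                offset = fh.tell()
                line = fh.readline()
                if not line:
                    break
                record = json.loads(line)
                index[record["fund_id"]].append((feed.name, offset))
    return index

def report_for(fund_id, feed_dir, index):
    """Pull every record for one fund into a single report via direct seeks."""
    parts = []
    for name, offset in index.get(fund_id, []):
        with open(Path(feed_dir) / name, encoding="utf-8") as fh:
            fh.seek(offset)
            parts.append(json.loads(fh.readline()))
    return parts
```

The expensive scan happens once per import cycle; after that, assembling a fund's full report is limited only by how fast the disk can seek.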
Our QA team has benefited significantly from this new functionality because it lets them use a method called statistical sampling. If this terminology is new to you, it means we validate a random sample of funds instead of the entire database. With proper statistical sampling, a QA team can dramatically cut the number of checks needed to validate the accuracy of new functionality.
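One common way to size such a sample is Cochran's formula with a finite-population correction; the choice of formula here is an illustrative assumption, not a description of our exact QA procedure. The payoff is striking: at a 95% confidence level and a ±5% margin of error, even a universe of 213,000+ funds needs only a few hundred checks.

```python
import math
import random

def sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Cochran's formula with finite-population correction.

    confidence_z=1.96 corresponds to 95% confidence; p=0.5 is the
    most conservative assumption about the error rate.
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def draw_sample(fund_ids, **kwargs):
    """Pick a random sample of funds to validate, sized by the formula above."""
    return random.sample(fund_ids, sample_size(len(fund_ids), **kwargs))
```

For a 213,000-fund universe this works out to 384 funds, which is why sampling beats exhaustive validation by orders of magnitude.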
What we have built benefits not only developers but also QA testers and business leaders evaluating how to use Morningstar data.
Creating powerful applications for our subscribers is our primary purpose. To accomplish it, we focus on implementing processes and developing custom internal functionality that let us build better solutions for our subscribers. The less time we spend waiting, sorting, and validating data, the more time we have for creating, enhancing, and building solutions that benefit them.
Justin is the Chief Technology Officer and Project Manager at PlanTools. Our mission is to design, build and maintain the most powerful retail and custom solutions in the financial industry.