Navigating the World of Big Data Analytics
November 4, 2016
New analytics software and methods are developing at such a rapid pace that it's difficult for the enterprise to keep up. It's impossible to learn about them all, but some of them may be very useful for your business.
Even the most talented and innovative data science and analytics teams are faced with two key challenges. How can you stay on top of the ever-changing analytics landscape to know what is worth trying? And, once the right tools are identified, how can you get the analytics and technical resources to test them in a timely manner?
Analytics teams we’ve worked with in banking and insurance have encountered some combination of the following problems:
- They have a defined analytical question, but don’t know which of several potential approaches is best suited to answer it – and maybe aren’t even aware of all the latest relevant tools.
- They have an idea for a technical solution to try, but don’t have enough hands-on experience with the latest software/versions/packages available to get up and running efficiently.
- They have the capability to test a new solution, but for one reason or another (cumbersome procurement processes, compliance issues around open source software, budget, bureaucracy) cannot get easy access to the software they need.
- They cannot easily get the right data and tools in the same place. Security concerns or current compliance regulations prohibit them from moving internal data to the cloud or from bringing external data sources inside, or they simply don't have the resources and experience to navigate, integrate, and clean the external data they want to use.
The internal roadblocks we’ve seen vary. The internal IT procurement process makes it too cumbersome or impossible to acquire new software. Compliance issues prohibit external data or open source software. PII data can’t be brought into AWS. Limited resources prevent them from staying current on all the tech and data sources available.
A truly agile analytics group would be able to bring external and internal data together in one place where they could quickly test out a variety of the latest tools available with experts to guide them along. Today’s financial service enterprise can’t achieve this analytical agility on its own – it’s simply too much to ask given the resource constraints, internal structure, and regulatory environment.
This is exactly why we created our Agile Analytics Lab: to help our partners test new technologies, new methodologies, and new data sets quickly in an environment that can scale up and down and that meets all of their security and compliance requirements. Of course, compute resources and software licenses are useless without the people who know how to leverage them. Our lab is staffed with top analytics experts and a tech team that is up to date on the latest big data platforms. Our data scientists, developers, and infrastructure teams are constantly exploring the frontiers of analytical methods, software, and hardware, and our agile approach allows us to master and deploy new platforms quickly. Our process is highly collaborative, which makes it easy to turn over any insights, specs, and configurations to our partners' internal IT organizations when it makes sense to bring something back in house.
Our Agile Analytics Lab is also a place to experiment with new public data sets and platform APIs. We’ve already done the work to integrate data sets such as:
- Weather, traffic, and safety data, including historical, current, and predictive
- Demographic, financial & economic data from markets around the world
- Social media feeds from sources including Twitter, Instagram and Flickr
- Marketing & advertising data from the top ad serving, marketing and sales automation platforms
At Fulcrum we manage agile analytics labs for some of the largest banks and insurance companies in the world. We help them achieve analytical agility by providing them with virtual infrastructure to spin up exactly the servers and software they need. We guide our partners through the entire process, from selecting tools to implementing packages to supporting model building with the latest technologies. We curate a variety of external data sources so that they are easy to integrate with their existing data – this involves identifying, cleaning, and providing metadata so they are ready to go. It's the job of our data science team to stay on top of the latest developments so that our enterprise partners can focus on their job: obtaining the best analytical solutions for their business.