
AI, Automation, and Alpha: Alternative Data Trends 2025

Adrian Krebs, Co-Founder & CEO of Kadoa

Just back from Neudata London, where the alternative data community gathered to discuss the latest trends, challenges, and opportunities. We had many discussions about how the alt data landscape is evolving in the age of AI.

Based on the sessions and customer conversations, here's what's top of mind right now:

Trends

1. AI is Everywhere

It's impossible to ignore the impact of AI and Large Language Models (LLMs). Funds aren't just experimenting; they're actively integrating these tools across their workflows. We heard how firms like Morgan Stanley are piloting AI across their entire fundamental research process – from translation and summarization to content generation and data insights. Others mentioned building proprietary internal tools (like variations of 'Firm-GPT') specifically for tasks like Q&A on internal documents, advanced sentiment analysis, or complex entity resolution, aiming to unlock efficiencies and new analytical angles.

2. Automation of Manual Research

A recurring pain point is the sheer amount of time highly skilled analysts spend on low-value, manual data tasks. Copying and pasting information from websites, filings, or disparate sources into spreadsheets remains a significant time sink. There's a strong push towards automating data ingestion, ensuring seamless updates, and centralizing data access within firms to free up analysts for higher-value analysis.

Research analysts are consuming so much data, and they are mostly doing it by hand.
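As a minimal sketch of what automating that copy-paste work can look like: the snippet below parses a metrics table out of an HTML page into structured records using only the Python standard library. The page content and field names are hypothetical placeholders (in practice the HTML would come from a fetched filing or website), and real pipelines would layer scheduling, change detection, and validation on top.

```python
# Sketch: turning a manual copy-paste task into structured extraction.
# PAGE is a stand-in for HTML fetched from a filing or website.
from html.parser import HTMLParser

PAGE = """
<table>
  <tr><th>Company</th><th>Revenue</th></tr>
  <tr><td>Acme Corp</td><td>1,200</td></tr>
  <tr><td>Globex</td><td>950</td></tr>
</table>
"""

class TableExtractor(HTMLParser):
    """Collects rows of cell text from <table> markup."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], None, False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

parser = TableExtractor()
parser.feed(PAGE)
header, *records = parser.rows
data = [dict(zip(header, rec)) for rec in records]
print(data)  # structured records instead of hand-copied spreadsheet cells
```

Once extraction is code rather than keystrokes, updates can run on a schedule and feed a central store, which is exactly the ingestion-and-centralization push described above.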

3. Fast Dataset Evaluation is Key

With a universe of potential datasets, data sourcing and quant teams need to move quickly. The ability to rapidly evaluate whether a new dataset holds potential uncorrelated alpha is critical. Vendors are being pushed to make trial periods and data testing as frictionless as possible. The faster a potential buyer can confidently reach a "yes" or "no" decision on a dataset's value, the better it is for everyone involved. Paul Walsh from Quantbot emphasized the need for decisiveness, suggesting a mindset where successful trials quickly lead to adoption.

4. Processed Insights Instead of Raw Data

The landscape is evolving beyond just providing raw data feeds. Increasingly, vendors are moving up the value chain by applying their own AI/ML models to generate insights, signals, and analytics directly. This "signal-as-a-service" model caters to funds that may lack the internal resources to process raw data at scale or prefer to consume pre-analyzed information.

Expect more signal-as-a-service instead of just raw data.

5. Proprietary Data & Custom Models

As basic financial and widely available alternative datasets become more commoditized, the real edge lies in uniqueness. Premium value is placed on proprietary or hard-to-acquire data sources – think specialized B2B transaction data, niche web scrapes targeting specific industries, or insights derived from expert network calls. Critically, it's not just about having unique data; it's about how funds process it with their own custom models and analytical frameworks. Connecting proprietary AI capabilities to these unique data streams is seen as key to maintaining a durable analytical advantage.

6. Data Quality and AI Transparency

Amidst the excitement around AI and new data types, fundamental requirements haven't changed. Poor data quality remains a significant obstacle. Access to clean, reliable, point-in-time data is absolutely non-negotiable, especially for rigorous backtesting of strategies. Furthermore, as AI-driven insights become more integrated into investment processes, portfolio managers and compliance teams are demanding full transparency. They need to understand the lineage of the data and the methodology behind how AI models generate insights to trust and effectively utilize the outputs.
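To make the point-in-time requirement concrete, here is a small illustrative sketch (the dates and figures are invented): each observation records both the period it describes and the date it became known, and a backtest may only use the latest value known on or before the simulation date. Without this, a restated figure leaks into the past and creates lookahead bias.

```python
# Sketch: point-in-time lookups to avoid lookahead bias in backtests.
from dataclasses import dataclass
from datetime import date

@dataclass
class Observation:
    period: date     # the period the figure describes
    known_on: date   # when the figure actually became available
    value: float

# Hypothetical history: a first release, then a later restatement.
history = [
    Observation(date(2024, 3, 31), date(2024, 5, 10), 100.0),
    Observation(date(2024, 3, 31), date(2024, 8, 2), 112.0),
]

def point_in_time_value(obs, period, as_of):
    """Latest value for `period` that was known on or before `as_of`."""
    known = [o for o in obs if o.period == period and o.known_on <= as_of]
    return max(known, key=lambda o: o.known_on).value if known else None

# A backtest running in June 2024 must see the original release,
# not the restatement that only arrived in August.
print(point_in_time_value(history, date(2024, 3, 31), date(2024, 6, 1)))  # 100.0
print(point_in_time_value(history, date(2024, 3, 31), date(2024, 9, 1)))  # 112.0
```

The same bookkeeping also supports the transparency demand: keeping `known_on` alongside every value is a simple form of data lineage that compliance teams can audit.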