Automated trading systems now account for the majority of volume on major financial exchanges. According to figures cited by the U.S. Securities and Exchange Commission, algorithmic strategies represent roughly 70 percent of equity trading volume in the United States. That dominance did not happen overnight. It is the result of seven decades of incremental development, starting with a single commodity fund manager who believed that rules mattered more than intuition.
Richard Donchian and the Birth of Rules-Based Trading (1949)
The traceable origin of systematic trading sits with Richard Donchian, who in 1949 launched Futures, Inc., one of the first publicly held commodity funds to operate on predefined buy and sell rules rather than discretionary judgment. Donchian developed what became known as channel breakout rules: a buy signal triggers when price exceeds its recent high, a sell signal when it drops below its recent low. These rules are still embedded in modern trading platforms under the name Donchian Channels.
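The channel breakout logic is simple enough to state in a few lines of code. The sketch below is illustrative, not Donchian's original formulation: the function name, the 20-bar lookback default, and the list-of-tuples output are all choices made here for clarity.

```python
def donchian_signals(prices, lookback=20):
    """Simple channel breakout logic: buy when price exceeds the prior
    `lookback`-bar high, sell when it drops below the prior `lookback`-bar low.
    Returns a list of (bar_index, signal) pairs."""
    signals = []
    for i in range(lookback, len(prices)):
        window = prices[i - lookback:i]  # prior bars only, excluding the current bar
        if prices[i] > max(window):
            signals.append((i, "buy"))
        elif prices[i] < min(window):
            signals.append((i, "sell"))
    return signals
```

A flat price series generates no signals at all, which is the point: the rules, not the trader's mood, decide when to act.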
In that era, executing a rules-based system required charting prices by hand from ticker tape. There were no computers, no spreadsheets, no live data feeds. Traders spent hours manually plotting price movements and identifying signals before calling a broker. The process was slow, but the underlying principle was sound: remove emotion from the entry and exit decision. That principle has not changed in seventy-five years.
The CMT Association, the body that awards the Chartered Market Technician designation, formally credits Donchian as a foundational figure in systematic trading methodology.
The Turtle Traders and Mathematical Entry Rules (1983)
The concept gained its most public demonstration in 1983 through an experiment run by commodities trader Richard Dennis. Dennis made a bet with his partner William Eckhardt over whether great traders were born or made. Dennis believed trading could be taught as a codified skill. He recruited a diverse group of people with no trading background, gave them a specific set of trend-following rules, and allocated real capital.
The results supported Dennis. The group, known as the Turtle Traders, applied breakout signals, position sizing formulas, and strict stop-loss rules to commodity futures and generated substantial returns. The experiment demonstrated that a well-designed system, applied consistently, could outperform discretionary judgment.
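Published accounts of the Turtle rules describe position sizing keyed to volatility: risk a fixed fraction of equity per unit of the market's average true range ("N"). The sketch below assumes the commonly cited 1 percent-per-N convention; the function name and parameters are illustrative, not a reproduction of the original rules.

```python
def turtle_unit_size(equity, n_atr, dollars_per_point, risk_fraction=0.01):
    """Volatility-based sizing in the style attributed to the Turtle rules:
    one 'unit' risks roughly `risk_fraction` of account equity per N of
    adverse price movement. `n_atr` is the average true range in points;
    `dollars_per_point` converts points to dollars for the contract."""
    dollar_volatility = n_atr * dollars_per_point  # dollar swing per contract per N
    return int(equity * risk_fraction / dollar_volatility)
```

The effect is that more volatile markets get smaller positions automatically, with no discretionary override.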
At roughly the same time, John W. Henry, later the principal owner of the Boston Red Sox and Liverpool FC, built a significant asset management business on mathematical trend-following models applied to futures markets. His firm managed billions using purely systematic approaches across multiple asset classes.
Academic research has supported the durability of these methods. A widely cited paper by Tobias Moskowitz, Yao Hua Ooi, and Lasse Heje Pedersen, published in the Journal of Financial Economics in 2012, found that time-series momentum strategies produced consistent risk-adjusted returns across asset classes over long historical periods, providing theoretical grounding for what practitioners had observed since the 1950s.
Personal Computers and Retail Access (1990s)
The 1990s transformed systematic trading from an institutional activity into something accessible to individual investors, primarily through affordable personal computers and dedicated software.
TradeStation became the platform through which retail traders could code their own rules, backtest them against historical data, and generate signals from home. For the first time, a private trader could replicate the kind of systematic logic that had previously required institutional infrastructure and a team of quantitative analysts.
This decade also saw the first commercial trading systems made available for purchase. Developers sold rule sets coded into platforms like TradeStation, allowing buyers to apply someone else’s methodology to their own account. The relationship was still direct: if the software had a problem, the developer fielded the call personally.
The arrival of commercial internet access in the late 1990s changed the speed of everything. Live data feeds became affordable, and traders could connect their systems to real-time price streams, generating signals continuously rather than running end-of-day calculations. The latency between signal and execution, previously measured in hours, began to collapse.
Electronic Exchanges and the CME Globex Revolution (Late 1990s)
The structural change that completed the automation of execution came from the Chicago Mercantile Exchange. The CME introduced E-mini futures contracts in 1997, starting with the S&P 500 E-mini. These contracts were smaller in notional value than standard futures and were designed to trade electronically on the Globex platform, bypassing the open-outcry trading floor entirely.
Globex meant that a computer could not only calculate where an order should be placed but also send that order directly to the exchange without any human in the chain. A system could identify a signal, size the position, and transmit the order in a sequence faster than a human could pick up a phone.
This is the moment systematic trading became fully automated in the modern sense. The academic literature on this shift is substantial. Terrence Hendershott, Charles Jones, and Albert Menkveld, in a study published in the Journal of Finance in 2011, found that the introduction of algorithmic trading on the New York Stock Exchange was associated with improved liquidity and narrower bid-ask spreads, suggesting automation had measurable market-quality benefits beyond just the firms deploying it.
The Signal-Following Model and Subscription Trading (1998 Onward)
A parallel development established the commercial structure through which most retail traders interact with systematic strategies today: following someone else’s signals for a recurring fee rather than building and maintaining a system themselves.
The origin of this model in the futures industry is partly traced to Attain Capital, where a client named Jack Telford agreed to allow other clients to follow the signals of a TradeStation system he had developed, in exchange for a modest fee. That arrangement formalized what became known as the system-assist model: a developer codes a strategy, subscribers receive the signals, and neither party needs to manage the technical infrastructure of the other.
Collective2, which launched in 2001, was among the first platforms to aggregate multiple strategy developers and allow subscribers to connect signals directly to a brokerage account. This intermediary layer separated signal generation from execution and removed the developer from the subscriber’s daily workflow entirely.
Mirror Trading, Copy Trading, and Social Platforms (2005 Onward)
The next phase arrived when signal following became social. Mirror trading, pioneered in the forex market by companies including Tradency around 2005, allowed retail traders to automatically replicate vetted strategies in real time. The trader selected a methodology, set allocation parameters, and the platform handled synchronization with their brokerage.
eToro extended this model in 2010 with CopyTrader, which let users follow and automatically replicate the positions of other individual traders rather than institutional strategies. The social layer made performance transparent: anyone could view a trader’s track record, drawdown history, and open positions before committing capital. By 2012, eToro reported over two million registered users and had become the dominant brand in what the industry was calling social trading.
Research from MIT by Pan, Altshuler, and Pentland (2012), analyzing trading behavior on eToro, found that moderate levels of social influence improved portfolio returns, while excessive copying homogenized behavior and increased systemic fragility. That tension remains relevant today.
Telegram and the Decentralized Signal Model (2017 Onward)
Starting around 2017, Telegram became the default infrastructure for a large segment of the crypto trading signal industry. A signal provider creates a channel, subscribers join it, and trade calls are broadcast as messages. Execution is manual, semi-automated via bots reading the channel and placing orders, or fully automated through API connections between the channel and exchange accounts.
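The semi-automated path depends on one fragile step: turning a free-text channel message into a structured order. The parser below is a hypothetical illustration; the message format, field names, and function name are assumptions, and real channels use many inconsistent formats that make this step far messier in practice.

```python
import re

def parse_signal(message):
    """Parse a hypothetical signal message of the form
    'LONG BTC/USDT entry 64000 stop 62500 target 67000'
    into a dict, or return None if the message is not a trade call."""
    pattern = (r"(?P<side>LONG|SHORT)\s+(?P<pair>\S+)\s+"
               r"entry\s+(?P<entry>[\d.]+)\s+"
               r"stop\s+(?P<stop>[\d.]+)\s+"
               r"target\s+(?P<target>[\d.]+)")
    m = re.search(pattern, message, re.IGNORECASE)
    if not m:
        return None
    return {"side": m.group("side").upper(),
            "pair": m.group("pair"),
            "entry": float(m.group("entry")),
            "stop": float(m.group("stop")),
            "target": float(m.group("target"))}
```

In a fully automated setup, the resulting dict would feed an exchange API call; everything downstream of the parse inherits whatever ambiguity the channel's wording contained.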
This architecture removed the intermediary platform entirely. There is no Collective2 or eToro standing between provider and subscriber. The tradeoffs are significant: no standardized performance verification, no regulated dispute mechanism, no enforced track record transparency. The signal provider’s reputation is the only quality filter. The model attracted a large volume of activity in crypto markets precisely because it required no regulatory registration and could scale from one subscriber to one hundred thousand on the same infrastructure.
AI Agents: Autonomous Decision-Making (2024 Onward)
The most significant shift currently underway replaces the human signal provider with an autonomous AI agent capable of reasoning, adapting, and executing without a fixed rule set.
Traditional algorithmic trading systems, including everything from Donchian’s channel rules to Telegram bots, operate on logic that a developer coded explicitly. The system does exactly what it was told to do. AI agents built on large language models work differently. They can process unstructured information, including news articles, earnings call transcripts, social media sentiment, and macroeconomic commentary, alongside numerical price data, and synthesize those inputs into trading decisions.
Research published on arXiv in late 2024 by Xiao, Sun, Luo, and Wang introduced TradingAgents, a multi-agent framework that assigns distinct roles to different LLM instances within a single trading workflow: analyst agents gather and interpret data, a risk management agent monitors exposure limits, and a trader agent synthesizes the inputs and executes decisions. The architecture mirrors the structure of a small trading desk, with each function handled by a specialized agent rather than a human. The framework was tested against standard benchmark strategies including buy-and-hold, MACD, and SMA across equities from January to November 2024.
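The division of responsibilities in such a framework can be sketched structurally. This is not the paper's code: the function names are invented here, and `ask` stands in for a real LLM call (any callable mapping a prompt string to a text answer), so the sketch shows only the role separation, not the reasoning itself.

```python
def analyst_agent(market_data, ask):
    # Analyst role: turn raw inputs into a textual assessment via the LLM.
    return ask(f"Summarize the outlook given: {market_data}")

def trader_agent(analysis, ask):
    # Trader role: convert the analysis into a discrete action.
    answer = ask(f"Given this analysis, reply BUY, SELL, or HOLD: {analysis}")
    return answer.strip().upper()

def risk_agent(proposed_size, max_size):
    # Risk role: a deterministic exposure cap, no LLM needed for this step.
    return min(proposed_size, max_size)

def run_desk(market_data, ask, proposed_size=100, max_size=50):
    """Wire the roles together in the 'small trading desk' pattern:
    analyze, decide, then cap exposure before anything reaches the market."""
    analysis = analyst_agent(market_data, ask)
    decision = trader_agent(analysis, ask)
    size = risk_agent(proposed_size, max_size) if decision in ("BUY", "SELL") else 0
    return decision, size
```

The design point is that the risk check sits outside the LLM loop: however the analyst and trader agents reason, the exposure limit is enforced by plain code.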
AI agents are also increasingly active in prediction markets. Platforms such as Polymarket have seen autonomous agents execute thousands of trades within single months, with some early-stage deployments reporting significant returns on individual positions. The Olas protocol, one of the more documented examples, frames the goal explicitly as building user-owned agent economies where individuals deploy autonomous software that generates value on their behalf across markets.
The global market for AI in trading is projected to grow from $21.59 billion in 2024 to $24.53 billion in 2025, a compound annual growth rate of 13.6%. The shift is not limited to institutional players. Platforms like Alpaca have made it possible for retail traders to connect LLM-based agents to live brokerage accounts with relatively minimal code, using natural language instructions that the agent interprets and converts into API calls.
The distinguishing characteristic of this generation of systems is adaptability. A rule-based system trading a breakout strategy does so identically in 2024 as it did when first coded. An AI agent, exposed to changing market conditions and new information, can update its behavior without a developer rewriting its logic. That capability introduces both new potential and new categories of risk: agents that adapt incorrectly, overfit to recent data, or interact unpredictably with other automated systems create failure modes that fixed-rule systems do not.
Regulatory attention is following the technology. The SEC and CFTC have both signaled interest in the accountability questions raised by fully autonomous trading systems, particularly around who bears responsibility when an agent causes market disruption.
A Continuous Thread
The structural continuity across seventy-five years is real. Donchian’s 1949 rules, Dennis’s turtle trader signals, a TradeStation system sold by subscription, a Telegram channel broadcasting calls, and a multi-agent LLM framework analyzing earnings transcripts are all solving the same problem: generating a decision about when to buy and when to sell, and acting on it faster and more consistently than unaided human judgment allows. The medium changed at each step. The problem did not.
References and Further Reading
- Hendershott, T., Jones, C.M., and Menkveld, A.J. (2011). “Does Algorithmic Trading Improve Liquidity?” Journal of Finance, 66(1), 1-33. https://onlinelibrary.wiley.com/doi/10.1111/j.1540-6261.2010.01624.x
- Xiao, Y., Sun, E., Luo, D., and Wang, W. (2024). “TradingAgents: Multi-Agents LLM Financial Trading Framework.” arXiv preprint arXiv:2412.20138. https://arxiv.org/abs/2412.20138
- Moskowitz, T.J., Ooi, Y.H., and Pedersen, L.H. (2012). “Time Series Momentum.” Journal of Financial Economics, 104(2), 228-250. https://www.sciencedirect.com/science/article/pii/S0304405X11002613
- Wikipedia: Automated trading system. https://en.wikipedia.org/wiki/Automated_trading_system

