Bringing Wall Street technology to bear on public service delivery.

TL;DR - I spent 7 years doing technology on Wall Street. I've spent the last 2.5 years figuring out the urban science x Gov/Civic/Urban Tech scene. Some thoughts.

Two things happened on April 1, 2014 that significantly changed how I viewed my career in finance until that point. Michael Lewis' new book, Flash Boys, had just come out, and 2 senior currency traders left the firm I worked at amid allegations of wrongdoing and collusion in the foreign exchange markets.

I worked closely with one of those traders and was totally "clueless" about what was going on - borrowing a term from Hugh MacLeod's cartoon and Venkatesh Rao's Gervais Principle of organizational dysfunction.

The Gervais Principle, via Venkatesh Rao's Ribbon Farm


Amongst other things, I functioned as a liaison between the "quants", crunching financial data and building algorithmic models, and the traders on the trading floor.

The quants, mostly based out of London, were soft-spoken, with thick English or Irish accents - a far cry from their trader counterparts, who tended to be loud and boisterous. One of the senior quants had a zen-like penchant for explaining the inner workings of his team's algorithms using Lord of the Rings metaphors.

This algo is like Gandalf -  it watches, observes and when the price is right, grabs as much as it can get.

This algo is like a Hobbit - it goes around snapping up liquidity at good prices, unbeknownst to the rest of the market.

...then there's the New York-based trading floor, the nerve center of this vast group-mind, where traders of all colors, shapes, sizes, and personalities sit glued to their screens, waiting to deploy a Gandalf or a Hobbit into the ring where trillions of dollars, euros, yen, pesos, and Swiss francs are traded, quite literally, at the speed of light.

Among this menagerie are salespeople who need to sell the abilities of said Gandalf/Hobbit to institutions or "high net worth" individuals who want to put gobs of money to work.

Let me also add that many of these traders do not have patience for flowery metaphors.

During this time, I was part of the Americas FX Support Team. Our job was to support and maintain a good chunk of the digital applications that powered foreign exchange trading out of New York City. We were 1 of 3 teams, distributed across London, Hong Kong and Mumbai, observing a "follow-the-sun" protocol to sustain the 24/7/365 nature of foreign exchange trading. Unlike equity markets, where the NYSE/NASDAQ have fixed open and close times, currency is almost always changing hands - or in this case, banks.

Any description of the technology that powers this scene will always be an understatement. When banks and hedge funds are ready to cut through mountains to gain a speed-up of a few milliseconds, "bleeding edge" is a somewhat unsatisfactory term for the technology and infrastructure. "Big Data", bluntly put, rates a batted eyelash or a yawn, at best.

A good chunk of my job was to analyze reams of data to prove why my trader's 150 million EUR/USD trade at 1.2234 did not execute in under 0.5 seconds, or 500 milliseconds. In about as much time, I had to unpack and explain the Gandalf and Frodo algos to a 6'5" trader who did not WIN in that moment and was counting on me to say that the technology was at fault.

More than half the time, he did not like the answer, because the tech was not at fault. Even worse, we both knew the real reason: his fingers didn't move fast enough. There was little he could do about it - a dramatic routine ensues.
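That forensic work mostly boiled down to subtracting timestamps across a chain of systems. A minimal sketch of the idea, using an entirely hypothetical order log (the event names and timestamps below are invented for illustration):

```python
from datetime import datetime

# Hypothetical order log for one trade: event names and timestamps
# are invented for illustration, not taken from any real system.
events = {
    "order_sent":    "2014-04-01T09:30:00.120",
    "ack_received":  "2014-04-01T09:30:00.135",
    "fill_received": "2014-04-01T09:30:00.610",
}

FMT = "%Y-%m-%dT%H:%M:%S.%f"

def latency_ms(start: str, end: str) -> float:
    """Elapsed milliseconds between two timestamped events."""
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return delta.total_seconds() * 1000

# Did the fill arrive inside the 500 ms budget?
print(f"fill latency: {latency_ms(events['order_sent'], events['fill_received']):.0f} ms")
```

In practice the hard part is not the subtraction but assembling one coherent, clock-synchronized timeline from logs scattered across servers in different data centers.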

150,000,000 Euros for Dollars in <0.5 seconds

Wall Street tech doesn't get talked about much in pop-tech media, yet it keeps gobs of money flowing in and out of NYC and the rest of the world. The only significant tech-driven Wall Street news event during my career was the 2010 Flash Crash - and that happened because things worked way too fast!

Taking a dip. May 6, 2010 via NYT / Bloomberg

The Financial Information Exchange (FIX) protocol is one of the few data protocols that power the backbone of global finance and banking. From using an ATM, to converting dollars to pesos for your next vacation, to your 401(k) ticking higher or lower - trillions of FIX and SWIFT messages are exchanged between servers and desks in NYC and London, and sometimes Kansas City.

FIX is a feat of information engineering but is also largely self-serving. 

The FIX protocol via Wikipedia


In a (ridiculously reductionist) nutshell, here is how FIX came to be.

In the early 1990s, a bunch of big banks knew that a great way to make more money was to, in effect, talk or type faster. The simplest way of doing that was to agree to speak a common language and come to agreements or disagreements really really quickly. 

A few savvy banks agreed to do it, started making money and everyone fell in line. Done. 
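To make that "common language" concrete: a FIX message is just a string of tag=value fields separated by the SOH (0x01) control character. The sketch below builds a simplified NewOrderSingle (MsgType 35=D) using standard FIX tags, but it omits session-level fields like sequence numbers and timestamps - treat it as an illustration of the wire format, not a working FIX engine.

```python
# Sketch of the FIX tag=value wire format. Tag 8 is BeginString,
# 9 is BodyLength, 35 is MsgType ("D" = NewOrderSingle), 10 is
# CheckSum; fields are joined by the SOH (0x01) delimiter.
SOH = "\x01"

def fix_checksum(partial_msg: str) -> str:
    # CheckSum is the byte sum of everything before tag 10,
    # modulo 256, rendered as a zero-padded three-digit string.
    return f"{sum(partial_msg.encode()) % 256:03d}"

def build_new_order(symbol: str, side: str, qty: int, price: float) -> str:
    body = SOH.join([
        "35=D",                                 # MsgType: NewOrderSingle
        f"55={symbol}",                         # Symbol
        f"54={'1' if side == 'buy' else '2'}",  # Side: 1=Buy, 2=Sell
        f"38={qty}",                            # OrderQty
        f"44={price}",                          # Price
    ]) + SOH
    partial = f"8=FIX.4.2{SOH}9={len(body)}{SOH}" + body
    return partial + f"10={fix_checksum(partial)}" + SOH

# A 150 million EUR/USD buy at 1.2234, rendered with "|" for SOH:
print(build_new_order("EUR/USD", "buy", 150_000_000, 1.2234).replace(SOH, "|"))
```

The banality of the format is the point: once everyone agreed on the tags and the delimiter, machines could negotiate trades without humans re-keying anything.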

In 2012, The UK government published a document titled "Standards in computer based trading: a review" that provides amazing insight into the origins of FIX amongst other standards in electronic finance.

Keep Calm and Trade On


Selected excerpts

  • FIX was created when Fidelity Investments asked Salomon Brothers to find a way for passing information about "pre trade liquidity" (you can glaze over the terms for now) and trades between their offices in Wall Street and Boston.
  • Early implementations were driven by an interest to maintain a sweet commission-earning relationship with Fidelity and other large asset managers. This helped drive adoption of a common language that would ultimately serve the public interest by increasing competition and reducing commission rates from 25-40 basis points to less than one basis point today.
  • Whilst in this case the public good benefited from private benefits driving standardisation, this is often not the case. Either standards that would benefit the public are not associated with private benefits and so do not get implemented, or they fall foul of industry bickering as each non-aligned private interest argues its corner. This reinforces the point about all stakeholders having a voice in standardisation.
  • In Financial Services, as in most business endeavours, there is also a cultural deference in that everyone defers to money, particularly to the payers of large fees. In some cases, this deference can serve the public interest, but in many cases, it does not. Fidelity’s pivotal role in establishing FIX illustrates this.
  • FIX is maintained by FIX Protocol Limited, FPL, a not for profit company that manages the specification. FPL is funded by market participants voluntarily becoming members and their membership fees are used to promote the protocol.
  • Because FIX is an open protocol that can be used by anyone, FIX suffers from the free rider problem: firms that do not contribute to something can still benefit from it.

Other interesting origin stories include:

The origins of High Frequency Trading. Image from Ghost in the Shell (via IMGUR)


I started my Wall Street career in October 2008 - just as Lehman Brothers was unravelling and global finance was having an existential crisis of faith.

During that time, big finance and insurance were the only sectors willing to sponsor the H-1B work visa on the East Coast. Joining the financial industry was more a function of need than choice.

3 years later, in October 2011, I witnessed the Occupy movement from the perches of the trading floor. The memories from those weeks of protest were surreal.

Trading floors are full of screens. The screen-to-person ratio is at least 7:1, in a work area dominated by a lack of natural light, with about 500 high-earning individuals under one roof, all united by a common purpose - making gobs of $$$$, or preserving the ability to make gobs of $$$$.

While CNBC was broadcasting scenes of protesters camping out in Zuccotti Park, we were actively normalizing our views of Bryant Park.

Image of Trading Floor via TreeHugger


Up until that point, working on Wall St. symbolized, to me, a great meritocracy of sorts. The frenzy of the Occupy movement and public radio coverage of the financial crisis were incredible opportunities to better understand the historical context I was a participant in and enabler of. Over the next 3 years, I began honing in on a simple idea:

What if I could re-purpose my tech skills in the efficient movement of financial data to the efficient movement of public data?

Marketplace was a great learning resource during this time.

Wall Street does technology well because the proverbial carrot is cold hard cash (period). Wall Street gets stuff done because they throw gobs of money at tech problems. Gobs of money is a talent magnet. Problems get solved even if it means brute-forcing cables through literal mountains or being petty enough to move your servers a few feet closer to the exchange. Any solution to such problems leads to 10x gobs of money.

This is the core script. A core function of this script, I have begun to realize, is the information exchange protocol - jargon for being able to share standardized data, or speak a common language, at considerable scale.

The case for their adoption, use and maintenance in local public services has never been more apparent. The same forces that allow Wall Street, Uber & Airbnb to operate and make gobs of $$$ can be deployed to deliver public services and preserve gobs of public trust in public institutions.

Consider these examples:

Most of these protocols were developed in the late '80s and matured in the '90s to enable the frictionless, interoperable movement of data that powered communications and locomotion. These protocols form the digital bedrock that reduces the marginal cost of delivering data of economic value.

What exists for Public services then? 

Consider Energy, Transportation, Healthcare and Water Conservation:

For Energy

The Open Access Same-Time Information System (OASIS), a product of the Energy Policy Act of 1992 that led to, amongst other things, ENRON.

ENRON treated the flow of energy the same way Goldman Sachs treats the flow of money. ENRON was great at developing applications on top of the open-access system and made billions - while woefully subverting public infrastructure and harming millions of Californians in the process.

CAISO, or the California Independent System Operator, took over when ENRON imploded and is essentially the New York Stock Exchange equivalent for California's energy marketplace.

CAISO currently maintains California's OASIS platform. CAISO is a non-profit.

(Video) Time for Public services to ASK WHY?

Enron Trading Floor


The CAISO Control Room - Look Familiar?


For Healthcare

There is the Electronic Medical Record (EMR) and there are Health Information Exchanges that aim to be the unified information standard and reduce the marginal cost of exchanging patient data for better health outcomes. Many local Health Information Exchanges are set up as non-profits and work closely with their state and local governments to adopt unified information exchange protocols.

Vermont Information Technology Leaders (VITL) is a canonical example here.

Vermont Information Technology leaders is a private non-profit that works closely with the State of Vermont to implement a Healthcare Information Exchange


For Transportation

Google brought us the General Transit Feed Specification (GTFS), which started as a side project and which many consider to have empowered various open data and civic tech movements.
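Part of GTFS's power is its simplicity: a feed is just a zip of plain CSV files, and stops.txt carries stop_id, stop_name, stop_lat and stop_lon per the published spec. A minimal sketch (the sample rows below are invented for illustration):

```python
import csv
import io

# GTFS feeds are zipped CSVs. stops.txt carries stop_id, stop_name,
# stop_lat, stop_lon per the published spec; these rows are made up.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Main St & 1st Ave,34.0522,-118.2437
S2,Main St & 5th Ave,34.0560,-118.2500
"""

# Index stops by stop_id for constant-time lookup.
stops = {row["stop_id"]: row for row in csv.DictReader(io.StringIO(stops_txt))}
print(stops["S1"]["stop_name"])  # Main St & 1st Ave
```

That any transit agency's feed can be parsed with a few lines of standard-library code is exactly why a single spec spawned an ecosystem of trip planners and transit apps.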

The clear role and opportunity for government to get it right with respect to Connected & Autonomous Vehicles (CAVs) is to work towards an information exchange protocol around V2C and V2I data formats, and to create entities that act as unbiased arbiters of the digital infrastructure enabling the next generation of mobility.

via Paul North, the New Yorker


...and finally Water.

Interestingly, back in the early '00s, ENRON executives planned to "ASK WHY?" for water markets and created Azurix to begin privatizing global water markets. Things thankfully went south for them - otherwise Frank Herbert's Dune might read as rather less of a sci-fi novel.

"To Save California, read Dune", via Nautilus


There does exist an "open and transparent" opportunity for California's water data via a recent piece of legislation calling for the creation of an integrated and open water data platform that, amongst other benefits, promotes "openness and interoperability of water data" and holds that "making information accessible, discoverable, and usable by the public can foster entrepreneurship, innovation, and scientific discovery."

Projects like the Open Water Rate Specification (OWRS) are just getting started and would enable water stakeholders to exchange data seamlessly, reducing the marginal cost of sharing and delivering water data.
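To see why a machine-readable rate specification matters, consider tiered water rates. The sketch below uses an invented rate table (the field names are illustrative, not actual OWRS fields) to compute a bill; with a shared spec, every utility's rates could be processed by the same few lines of code instead of bespoke spreadsheets.

```python
# Hypothetical tiered rate table in the spirit of a machine-readable
# rate spec like OWRS. Field names and prices are invented.
rate = {"tiers": [
    {"up_to": 10,   "price": 2.50},  # first 10 units (e.g. CCF)
    {"up_to": 30,   "price": 4.00},  # next 20 units
    {"up_to": None, "price": 6.00},  # everything beyond
]}

def bill(usage: float, tiers) -> float:
    """Charge each slice of usage at its tier's price."""
    total, prev = 0.0, 0.0
    for t in tiers:
        cap = t["up_to"] if t["up_to"] is not None else usage
        used = max(0.0, min(usage, cap) - prev)
        total += used * t["price"]
        prev = cap
        if usage <= cap:
            break
    return total

# 35 units: 10*2.50 + 20*4.00 + 5*6.00
print(bill(35, rate["tiers"]))  # 135.0
```

A standard encoding of exactly this kind of structure is what lets analysts compare rates across hundreds of utilities without hand-transcribing each one's tariff PDF.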


Where glossy investment banks got together and leveraged advances in technology to benefit themselves, local water utilities are starting to do the same to benefit California's water system.


A small team of purpose-driven public technologists, leveraging advances in low-cost devices, data and decision-making - plus the right kind of support - is all it takes to build and maintain public digital infrastructure.

The coordination and technology talent needed to pull off these feats of public information engineering are, in spirit, similar to constructing the bridges and highways that we all pay for via our taxes.

The eventual answer may lie in a digital equivalent of the Army Corps of Engineers: an institution to build and maintain the next wave of digital infrastructure powering 21st-century public service delivery.

The California Data Collaborative ("CaDC") is a coalition of water utilities that have pioneered a new data infrastructure non-profit to support water managers in meeting their reliability objectives and serve the public good.


The Army Corps of Engineers is one of the world's largest public engineering, design, and construction management agencies. Although generally associated with dams, canals and flood protection in the United States, USACE is involved in a wide range of public works throughout the world. The Corps of Engineers provides outdoor recreation opportunities to the public, and provides 24% of U.S. hydropower capacity.

Varun Adibhatla



LA Metro aims to "ease" traffic. Why aren't we trying to end it?

Like many native Southern Californians, I’ve endured our infamous traffic (and unending jibes from out of town visitors) as long as I can remember.  Unlike most, however, I grew up in a profoundly technocratic household and spent my childhood dinner table conversations debating demographic projections and arguing how best to manage our region's infrastructure.

The ARGO Street Quality Identification Device ("SQUID") is very much in that pioneering, technocratic spirit and aims to improve how basic public services like street maintenance are delivered. The uniquely affordable sensor measures street roughness and leverages computer vision to determine street quality - the same underlying technology powering self-driving cars.

This agile, easily implemented technology can help digitize street inspections as a starting point for automated road assessment, and supports self-driving cars by promoting well-maintained, easily identifiable streets. It also shows that computer vision isn't magic and offers an intuitive path to operationalizing the technology on roads today.

Having recently returned to Southern California from that world to lead a big water data project, the triumphalism surrounding the recent completion of the Expo Line to Santa Monica has struck me as particularly odd. Sure, I've loved riding my beloved Gold Line, and I'll enjoy the novelty of taking the train to the beach on a weekend once the excitement dies down.

Yet light rail serves only approximately 1% of transportation trips in Southern California, and only 25% of transit trips occur via train (the vast majority are on buses). Shouldn't our government aspire to provide more public benefit than these expensive, inflexible trains that serve as functional daily transportation for only a small slice of Southern California's sprawling metropolis? Los Angeles is a dynamic city and deserves better than rigid, fixed transport.

LA Metro, however, plans to spend $42 billion over the next forty years on transit projects to fund bus terminals and a dramatic expansion of LA's light rail system. But even the biggest dreams of LA Metro's light rail expansion don't come close to the coverage of our historic Red Car system, which extended all the way to Redlands and Newport Beach while providing dramatically greater breadth within LA.

LA's current light rail plan on the left and historic red car system on the right.

Rebuilding a portion of that Red Car system won't solve our traffic problems; instead, as the signs say, "LA Metro eases traffic." So why exactly are we spending billions of dollars on a plan that doesn't even aim to solve the problem? The strangest thing about LA Metro's plan is that its "innovation" section discusses streetcars and circulator projects rather than the obvious: self-driving cars.

Autonomous vehicles already putter around Palo Alto and have driven millions of miles. The tragedy, though, is that despite LA Mayor Garcetti's admirable goal of making LA "the First City to Do Autonomous Vehicles Right," LA Metro's $140 billion plan doesn't even mention the technology. And as Andrew Ng, former Google Brain researcher and currently Chief Scientist at Baidu, has articulated, self-driving cars require regulation and infrastructure investment:

“Autonomous driving's biggest problem is addressing all the corner cases--all the strange things that happen once per 10,000 or 100,000 miles of driving. Machine learning is good at getting your performance from 90% accuracy to maybe 99.9%, but it's never been good at getting us from 99.9% to 99.9999%. I think it is more promising to start with a different goal: A shuttle/bus that can only drive one bus route or just in a small region.”

That approach is perfect for Southern California, which lacks a central hub like Manhattan and would benefit hugely from the flexibility autonomous vehicles afford. Imagine if, rather than 40-person buses, you had fleets of 6-person pods picking up passengers precisely where they are, using ridesharing technology similar to Lyft or Uber. That would solve the last-mile problem that traditionally cripples public transit.

Furthermore, such pods could trail each other on freeways nearly bumper to bumper at high speeds, drafting off one another like cyclists in a peloton and achieving dramatically greater fuel efficiency. The resulting lower per-mile costs - plus a potential public subsidy for reduced congestion and pollution - could provide the economics for near-ubiquitous adoption, transforming Southern California's urban landscape in the process.

Such pods would not need to park near high value business districts, freeing up Southern California’s scarce land for more valuable uses.  Southern California’s spread out suburbs might even switch from traffic choked sprawl into the network of garden cities it was always meant to be as old car parks and garages are transformed into new public spaces.

That vision will take some time to implement, but note that the important barriers aren't technological so much as technocratic. Autonomous vehicles are already here. Deploying them in the wilds of an urban environment will require regulation to ensure safety, just as the introduction of the locomotive required different rules about walking on train tracks than those society was habituated to with horse paths.

The $42 billion LA Metro plans to spend over the next forty years could be repurposed into an integrated plan to prepare LA's roadways for autonomous vehicles and pilot this pod approach in an LA neighborhood or on dedicated freeway lanes. Southern California's highway network already provides solid coverage of the region, and high-speed autonomous pods offer the promise of leveraging that existing investment rather than building soon-to-be-stranded light rail assets.

That modest pilot could be implemented in the immediate future with existing technology through a creative public private partnership with the many, many corporations lining up to be the world’s first to deploy self driving cars in a live urban environment.  Even if those cars were limited to dedicated lanes they could be supplemented with subsidized car networks like Uber or Lyft to solve the last mile problem.  And critically such a pilot puts us on a path toward potentially ending traffic across the region through ubiquitous pod deployments. Many will say such a change is impossible, but the advance of autonomous vehicle technology is inevitable.  

Los Angeles historically has been the city of the future, a place where dreams come true and the impossible is realized.  The city has fallen on hard times in recent decades, experiencing stagnating employment growth and steady traffic, but Southern California boasts a world beating cast of creative talent, attracting entertainers and engineers the world over with our famous weather and laid back lifestyle.  

Among those are many who believe, as Carey McWilliams once did and I still do, that the "most fantastic city in the world will one day exist in this region: a city embracing the entire region from the mountains to the sea." Rather than simply easing traffic, why not flip Los Angeles's infamous congestion on its head and be the first city to end it?

-Patrick Atwater

It remains unclear how "streetcar and circulator projects" qualify as innovation.



Musings on how the blockchain might help streamline California's water markets

Perhaps unsurprising with the historic drought, California is currently working on several major initiatives to upgrade its water data systems.  The state launched a water "data innovation challenge" and Dodd's AB 1775 offers a key framework for integrating existing public statewide water data sources necessary for a water market.

On a more local level, our plucky band of civic data scientists helped launch a unique data collaborative across local water utilities, working together to pioneer new water data infrastructure and lower the transaction costs associated with sharing sensitive, yet critical-to-analyze, customer water use data.

At the UC Davis "Data Sharing and Security" workshop last week, one leading lawyer floated the idea of using the blockchain to help validate data sharing.  We've been kicking around similar ideas internally so thought we'd quickly flesh those out a bit.  

The blockchain offers an intriguing framework for ensuring accurate coordination across disparate data sources, and might be worth experimenting with for already-public data, like the water data sources tasked to be integrated to streamline water markets per Dodd's AB 1775.

At the risk of sounding wildly idealistic, that could provide a foundation for governing water markets.  Consider the following metaphor from the digital triumphalists over at Tech Crunch:

Blockchains loosen up trust, which has been in the hands of central institutions (e.g., banks, policy makers, clearinghouses, governments, large corporations), and allows it to evade these old control points. For example, what if counterparty validation can be done on the blockchain, instead of by a clearinghouse?

An analogy would be when, in the 16th century, medieval guilds helped to maintain monopolies on certain crafts against outsiders, by controlling the printing of knowledge that would explain how to copy their work. They accomplished that type of censorship by being in cahoots with the Catholic Church and governments in most European countries that regulated and controlled printing by requiring licenses. That type of central control and monopoly didn’t last too long, and soon enough, knowledge was free to travel after an explosion in printing. To think of printing knowledge as an illegal activity would be unfathomable today. We could think of the traditional hold...
— Tech Crunch

The blockchain - as the metaphor above and The Economist note - is ultimately a trust machine, so one could imagine leveraging the technology at two levels in water markets where trust is critical:

  1. Cleaning raw data -- transforming raw, unstructured data into a standardized, mutually agreed-upon format and verifying its accuracy. Think of the SWP flows or fish counts inventoried and tasked to be integrated into more accessible, standardized data formats per the Dodd bill. Easy examples involve things like unit conversions, date-time conventions and data formats. Harder examples involve leveraging scientifically rigorous modeling of raw sensor readings to get accurate assessments of, say, water quality. All of that needs to be verified and could be streamlined quite epically via the blockchain.
  2. Accounting in water markets -- one could also imagine leveraging the blockchain to account for transfers in water markets, which would help avoid the bureaucratic turf battles and overlapping jurisdictions that Tony Castalletto has discovered in his research on California's water data systems - a finding that aligns with my experience in water management.
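The trust property both ideas rely on can be illustrated with a minimal hash chain: each record commits to the hash of the previous one, so any retroactive edit invalidates everything after it. This sketch covers only the tamper-evidence piece of a blockchain (no consensus, no distribution), and the record fields are invented for illustration.

```python
import hashlib
import json

# Minimal hash chain: each record commits to the previous record's
# hash, so any retroactive edit breaks verification of the chain.
def add_record(chain: list, payload: dict) -> list:
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev": prev_hash}
    # Hash the canonical JSON form of the block's contents.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    chain.append(block)
    return chain

def verify(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        body = {"payload": block["payload"], "prev": block["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if block["prev"] != prev or block["hash"] != expected:
            return False
        prev = block["hash"]
    return True

chain = []
add_record(chain, {"meter": "A-100", "af_transferred": 12.5})
add_record(chain, {"meter": "B-200", "af_transferred": 3.0})
print(verify(chain))   # True

chain[0]["payload"]["af_transferred"] = 999  # tamper with history
print(verify(chain))   # False
```

A real deployment would add signatures and some agreement mechanism over who may append, but even this bare structure shows how shared, verifiable history could replace a clearinghouse-style referee for water transfer records.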

Basically, the blockchain could help power a "water data marketplace" that ensures the seamless information flows a water market requires. It might also have broader implications for how public data gets parsed and utilized. Anyway, nothing like this exists in the world to my knowledge, so it could be fun to pioneer...



