The oldest businesses in the world sit at exchange points. The merchant who set up shop at the mountain pass. The money changer at the port. The import broker who never grew a single crop but got rich on every harvest that crossed a border. The currency exchange at every international airport that has existed since commercial aviation began — not because they're clever, but because they're necessary. They sit where two systems meet and can't exchange without help.

This isn't a new observation. Economists have written about it. Historians have documented it. Every trade route in human history has a version of it — the Silk Road had caravanserais, the spice trade had the Venetians, modern oil markets have their chokepoints. The pattern is ancient and consistent: wherever something valuable needs to cross from one system into another, a business grows at the seam.

I'm not an economist or a historian. I'm a technical founder in Chicago with a dog named Moose who costs me hundreds of dollars a year in vet bills. But I've been obsessing over this pattern — specifically where it shows up in data. Because data has the same problem every physical good has ever had at a border crossing: it exists on one side and is needed on the other, and the crossing is harder than it should be.

So I ran an experiment. I built a framework to find my version of the exchange point. Then I built a product to sit at it. This is what I learned.


The framework has to come before the product

Before writing a line of code, I ran a structured brainstorm. Not because it's a clever process, but because the alternative is building something nobody asked for — which is my most reliable failure mode.

The sequence is straightforward: problem first, customer second, competitive analysis third, defensibility test fourth, solution last. The order matters. Starting anywhere else is how you end up with something technically impressive that nobody uses.

I have a specific failure mode the framework is calibrated against: B2C gravitational pull. Every time I start with an open-ended question about what to build, my instinct defaults to a consumer product. It's faster to imagine, easier to prototype, and almost always harder to monetize than the version sitting right next to it.

Project Turnkey is the clearest example. I built a residential lease scanner to help tenants identify problematic clauses before signing. Clean problem. Real pain. I spent weeks on it. I never once considered the commercial angle — the property manager reviewing dozens of leases a month, the landlord trying to standardize terms across a portfolio, the businesses on the other side of the transaction who had the same problem at ten times the volume and would pay real money to solve it. The consumer version was crowded, hard to charge for, and dependent on users who had the problem once and moved on. The commercial version was sitting right there the whole time.

The framework exists to catch that pull before it costs three months.

One other anti-pattern worth naming: starting with the solution. The temptation, especially for technical founders, is to see an interesting technology and work backwards to a problem it could solve. That's how you end up building AI wrappers — impressive demos that collapse the moment the underlying model improves or a competitor ships the same thing with a better distribution channel. The framework forces the question back to the beginning every time: what's actually broken? Who has that problem? Are they already paying — in money, time, or stress — to work around it?

If the answer to that last question is no, stop. The problem isn't real enough.
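The ordering described above behaves like a series of gates: each question must pass before the next one is even worth asking. A minimal sketch, with stage names and the function entirely my own illustration rather than any formal tool:

```python
# Illustrative sketch of the framework's ordering. Each gate must pass
# before the next stage is considered; stage names are hypothetical.
STAGES = ["problem", "customer", "competition", "defensibility", "solution"]

def run_framework(answers: dict) -> str:
    """Walk the stages in order; stop at the first gate that fails."""
    for stage in STAGES:
        if not answers.get(stage, False):
            return f"stop: {stage} gate failed"
    return "build it"

# The "already paying" test is the problem gate: if nobody spends money,
# time, or stress working around the problem, nothing downstream matters.
print(run_framework({"problem": False}))
print(run_framework({s: True for s in STAGES}))
```

The point of the structure is that a failure at the problem gate makes every later question moot, which is exactly why starting with the solution inverts the whole thing.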

Where I wanted to position relative to AI

The biggest paradigm shift in my lifetime isn't a specific product. It's that intelligence itself became cheap. Models that would have required a research team and a data center three years ago now run for fractions of a cent per call. Every industry is being touched. The question for a technical founder isn't whether to work with AI. It's where to stand relative to it.

Two failure modes I wanted to avoid.

The first: building directly on top of AI. Thin wrappers around foundation models have the defensibility of a sandcastle. The moment the underlying model improves — which it will, on a predictable schedule — the wrapper becomes redundant. You're not building a product. You're renting a feature.

The second: ignoring it entirely. The wave is real. Fighting it is not a strategy.

The third path — the one worth exploring — is finding where AI has to wait. Real-world data doesn't arrive structured, labeled, and ready to use. Someone has to collect it, organize it, make it legible before it's useful to anyone. That moment of collection and structuring is the gateway. That's where the exchange point sits.

The pattern I'd been studying in trade and energy markets applies directly here. The currency exchange doesn't produce anything. The import broker doesn't grow anything. They sit at the point where something valuable needs to cross from one system into another and make the crossing possible. In data terms: whoever owns the collection point owns what flows through it. That's the position I was looking for — not building intelligence, but owning the place where real-world signal becomes structured data that intelligence can actually use.

The first version was wrong. The problem wasn't.

It was the tail end of winter in Chicago. Not the sunny 70-degree days — those take care of themselves. Not the blizzards — those are easy to skip. The 45-degree windy days. The days that are technically walkable but where you'll talk yourself out of it if you don't have a reason. Moose is overweight. The app was accountability infrastructure for the marginal decision — the days where the excuse is available if you want it.

Twelve hours to a working Strava integration. GPS routes, pace, distance, walk history. Clean. Functional. And too narrow.

The first crack appeared when Amanda couldn't see her walks. This was supposed to be Moose's health record — a shared picture of his activity and wellbeing. Why did it only know about my half of it? Added co-owner access.

The second crack: something happened with Moose's stomach. Sent Amanda a text. Three weeks later at the vet, that text was impossible to find — buried in a thread, between a photo and a grocery question. The observation existed. It just lived nowhere legible.

The third crack: a new vet. Reconstructing six months of history from memory in an exam room. Vaccines — when was the last one? Labs — did we do bloodwork this year? Weight trend — is he up or down from last quarter? All of that existed somewhere. None of it was in the room.

The fourth crack: the manual entry problem. Even with the right structure in place — a health timeline, categories for every kind of event — getting a year's worth of history into the app meant logging every event by hand. Nobody is going to sit down and type out three years of vet records. So document extraction was added. Photograph a vaccine certificate, an old lab result, a discharge summary. The app pulls out the structured events automatically. An entire health history in under thirty seconds. The junction only works if the data can actually get there.

The exchange point wasn't walks. It was the entire fragmented record of a pet's life — vaccines, labs, medications, municipal registrations, behavioral observations, weight trends, vet notes — scattered across systems that were never designed to talk to each other. The vet keeps their records. The city keeps the license. The owner keeps the memory. Nothing translates. Nothing connects.

Nobody was sitting at the junction making the exchange possible.

What the exchange point actually looks like

Furvalis doesn't just store data. Storage is table stakes. The value is in the exchange — making it dramatically easier for health information to move between every layer that needs it.

Amanda logs a walk. I see it. The vet gets a generated summary before the appointment. The boarding facility gets a shareable care card with behavioral notes and current medications. The co-owner gets a notification when something new is logged. The health status dashboard shows what's due, what's current, what's overdue — without anyone having to reconstruct it from memory.

Each of those is an exchange between two layers that previously couldn't talk. The value compounds with every layer added — not because the product got more complex, but because each new exchange makes the existing data more useful to more people.

This is what I mean by the enrichment layer. The data isn't just stored — it's structured, summarized, made shareable, made legible to whoever needs it next. The vet who gets a generated summary before an appointment doesn't need to spend the first five minutes of a visit reconstructing history. The partner who gets a notification about a new health event doesn't need to be sent a screenshot of a text message. The boarding facility that gets a care card link doesn't need to call and ask about medications.
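The enrichment layer described above is, mechanically, one store of events rendered differently for each consumer. A sketch under assumed field names (none of this is Furvalis's actual schema):

```python
# Illustrative sketch of the enrichment layer: the same stored events,
# rendered one way for the vet and another for the boarding facility.
from dataclasses import dataclass

@dataclass
class Event:
    kind: str
    name: str
    date: str

events = [
    Event("vaccine", "Rabies", "2024-03-15"),
    Event("medication", "Apoquel", "2025-01-02"),
    Event("observation", "Stomach upset after new food", "2025-02-10"),
]

def vet_summary(events):
    """Chronological digest for the exam room."""
    return "\n".join(f"{e.date}  {e.kind:<11} {e.name}"
                     for e in sorted(events, key=lambda e: e.date))

def care_card(events):
    """Boarding-facility view: current medications and behavior notes only."""
    meds = [e.name for e in events if e.kind == "medication"]
    notes = [e.name for e in events if e.kind == "observation"]
    return {"medications": meds, "behavior_notes": notes}

print(vet_summary(events))
print(care_card(events))
```

Each new consumer is just another projection over the same data, which is why adding a layer makes the existing records more valuable without making the product more complex.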

That's what makes the junction defensible. It's not just a pipe. It's a translator. And the more systems that depend on the translation, the harder it is to remove.

The caveat

This only works if the friction is real and people are already paying to work around it.

An exchange point in a system where nobody actually needs the exchange to happen is just infrastructure for a problem that doesn't exist. The pattern is ancient and consistent — but it only generates value when the crossing is genuinely necessary and genuinely hard.

The test I ran: are people already spending money, time, or stress on this? Moose costs hundreds — sometimes thousands — a year in vet bills. The appointments are happening. The records are being generated. The problem of those records being inaccessible at the moment they're needed is a real cost, paid in fumbled exam room conversations and repeated tests and information lost in text message threads.

The pain was established. The spending was already happening. The only question was whether a better junction made it easier to manage.

If the answer to "are people already paying to solve this, one way or another" is no — stop. The framework will tell you that before you write a line of code. That's the point of running it first.

Where the experiment stands

Furvalis is live. The exchange point is real. People have the pain. The data exists and wants to move.

What's still unproven: whether it's defensible at scale. Whether the B2B layer — vet clinic dashboards, pet insurance data licensing — fully validates the thesis. Whether owning the collection point is actually a moat or just a head start that a better-funded competitor closes in eighteen months.

That's what the next phase is for. The framework got me to the right starting point. The product is the test. The results will either confirm the thesis or break it — and either way, something useful comes out of it.

I'm not presenting this as a discovered truth. It's an observation I've been sitting with, a framework I built from it, and one product I'm using to find out whether any of it holds. The pattern is ancient. The application is mine. The jury is still out.

If you're a founder trying to find your exchange point — not a feature to build, not a market to enter, but the junction in a system where data exists on one side and is needed on the other — that's the conversation worth having.

Book a call.