The Open Garden Framework: A New Operating Model for Programmatic Advertising
Sarah Moss
April 22, 2026
9 minutes read
U.S. digital advertising keeps getting bigger, smarter, and more automated, yet for many marketers, it has never felt harder to control. In this article, we look at why the old platform-first model is breaking down and how the Open Garden framework offers a more practical way to run programmatic across a fragmented ecosystem.
The Open Garden framework has become a useful way to describe what many marketers already feel in practice: digital advertising has expanded across channels, platforms, data environments, and buying models much faster than the industry has developed a coherent way to run all of it together. U.S. digital ad revenue reached $258.6 billion in 2024, and IAB’s 2026 outlook says ad spend growth is expected to accelerate again in 2026. Growth is not the problem. Coordination is.
The gap is consequential. Organizations that added tools expecting greater control have frequently ended up with the opposite. Teams now work across DSPs, supply paths, identity approaches, privacy rules, commerce signals, CTV environments, and channel-specific reporting frameworks—all while performance pressure keeps rising. IAB says today’s media buyers are recalibrating around that pressure, while Deloitte notes that cross-platform audience intelligence remains fragmented and difficult to realize.
⚡ The buying problem in programmatic has largely been solved. The operating-model problem has not.
This article argues that the answer is not yet another platform layer. It is a better way to orchestrate what already exists. That is where the Open Garden advertising framework comes in.
Projected % change US ad spend YoY by channel (Source)
The shift: from platforms to orchestration
For years, programmatic strategy was framed as a platform choice. Which DSP should we use? Which SSP relationships matter most? Which channel mix should get the next tranche of budget? Those are still valid questions, but they are no longer sufficient.
The harder challenge now is how those pieces work together. A modern campaign may run across open web display, CTV, retail media, native, audio, and paid social adjacencies, while drawing on first-party data, modeled audiences, contextual signals, and measurement frameworks that do not naturally align. Meanwhile, IAB’s 2026 outlook says the market is shifting from AI experimentation toward AI as core infrastructure for campaign execution. That makes orchestration more important, not less.
In other words, the strategic question has changed from “Which platform should I buy through?” to “How do I coordinate planning, activation, supply, and measurement across a fragmented environment without losing control?”
Current AI adoption in advanced measurement (Source)
⚡ The challenge has moved from accessing media to coordinating the decisions that surround it across a fragmented system.
That is the shift from platforms to orchestration. It is also the reason an operating model matters.
What is the Open Garden framework?
The Open Garden framework is best understood as an operating model, not a product category. Its purpose is to structure how planning, activation, optimization, and measurement work together across multiple buying and data environments with the advertiser’s KPI at the center.
AI Digital’s Open Garden, for instance, is built on a DSP-agnostic, AI-enhanced, client-first approach designed to give advertisers more control, transparency, and cross-platform visibility. It is meant to counter the constraints of walled-garden execution by keeping decision-making anchored to business outcomes rather than to any one platform’s incentives.
That distinction carries real weight. A true Open Garden approach is not simply “use several platforms.” It is:
Plan against business outcomes, not platform defaults.
Select supply deliberately, not passively.
Use AI as an execution layer, not as decorative positioning.
Measure across environments consistently enough to make decisions with confidence.
When those pieces work together, marketers get something more valuable than convenience. They get a model for governing complexity.
This is where many definitions get slippery, so it is worth being direct.
The Open Garden framework is not a DSP. It is not a synonym for “multi-platform buying.” It is not just a collection of APIs, partners, and integrations stitched together after the fact. In other words, Open Garden is a system for orchestrating how the ecosystem operates as a whole, not just another tool in the stack.
That means it should not be confused with:
a tactical channel mix
a wrapper around several DSP seats
a one-off interoperability project
a generic managed service model
a shiny AI layer added on top of the same old reporting mess
A normal multi-platform setup can still be fragmented, manual, and opaque. An Open Garden advertising framework should do the opposite. It should make cross-platform execution more comparable, more governable, and easier to optimize against shared goals.
The current model breaks down because too much of the work still happens at the platform layer, while the business questions marketers need to answer sit above it.
A platform can tell you how something performed in that platform. It can even optimize aggressively inside its own environment. What it usually cannot do on its own is help you compare value cleanly across channels, reconcile exposure with outcomes across systems, or remove the operational waste caused by fragmentation.
Types of advanced measurement being used today (Source)
IAB’s Project Eidos was launched for exactly this reason. In announcing it, IAB said advanced measurement is more widely used than ever, yet confidence in results is eroding. According to IAB’s State of Data 2026 findings cited in that announcement, 60% to 75% of buy-side users say advanced measurement falls short on rigor, timeliness, trust, and efficiency, and none believe all paid channels are well represented in current MMMs.
% that say current approach doesn’t work very well (Source)
That is not a small technical inconvenience. It means teams are often doing all the labor of modern media without getting the confidence modern media was supposed to provide.
In day-to-day terms, the breakdown looks like this:
teams spend hours managing interfaces instead of improving strategy
channel reports arrive in different formats with different assumptions
supply quality is hard to compare across environments
optimization logic varies by platform
incrementality and attribution are discussed as if they were universal, when the underlying inputs are not
The result is a strange kind of maturity: more dashboards, more automation, more channel options, yet less clarity.
What fragmentation actually means
Fragmentation is often used as a catch-all complaint. It helps to make it more concrete.
Fragmentation does not simply mean “there are many channels.” It means the rules, logic, and data structures of those channels are different enough that coordination becomes expensive. Deloitte puts it plainly: as consumers move across social, streaming, linear TV, gaming, commerce, and live entertainment, those touchpoints are captured in disconnected systems, and cross-platform audience intelligence remains fragmented and difficult to realize.
So what does that mean operationally?
It means one platform defines reach one way, another defines outcomes differently, and a third makes supply visibility partial by design. It means audience logic does not travel cleanly. It means identity and privacy constraints reshape what can be matched, where, and how often. It means teams often end up comparing reports that look numerically neat but are not decision-ready.
Here is the real shape of fragmentation:
Buying logic differs. Auction mechanics, optimization controls, and inventory access vary by platform.
Supply access differs. Premium inventory is not exposed in the same way everywhere.
Data structures differ. Audience, exposure, and outcome data are rarely harmonized by default.
Measurement assumptions differ. Attribution windows, modeled conversions, and deduplication logic do not line up neatly.
Governance differs. Privacy, data rights, and reporting transparency are uneven across environments.
That is why fragmentation is not just a media planning issue but a governance issue.
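One item in the list above, differing measurement assumptions, is easy to see in miniature. The sketch below is illustrative only: the 1-day and 7-day windows and the event timestamps are assumptions for the example, not any platform’s real defaults. It applies two different post-click attribution windows to the same event log and gets two different conversion counts.

```python
from datetime import datetime, timedelta

# Hypothetical, simplified event log: one user's ad click and later purchases.
# Timestamps and windows are illustrative assumptions, not real platform data.
click_time = datetime(2026, 3, 1, 12, 0)
conversions = [
    click_time + timedelta(hours=6),   # same-day purchase
    click_time + timedelta(days=3),    # purchase three days later
    click_time + timedelta(days=10),   # purchase outside both windows
]

def attributed(conversions, click_time, window_days):
    """Count conversions that fall inside a post-click attribution window."""
    window = timedelta(days=window_days)
    return sum(1 for c in conversions if timedelta(0) <= c - click_time <= window)

# "Platform A" uses a 1-day window, "Platform B" a 7-day window (assumed defaults).
print(attributed(conversions, click_time, 1))  # Platform A reports 1 conversion
print(attributed(conversions, click_time, 7))  # Platform B reports 2 conversions
```

Same user, same purchases, two “truths.” Multiply that by modeled conversions and deduplication logic, and platform reports stop being directly comparable.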
Wasted spend is the cost everyone measures. Wasted organizational energy is the cost everyone absorbs.
IAB Tech Lab’s finalized Deals API specification, released in February 2026, describes deal workflows as something that still benefits from significantly reduced manual data entry and greater transparency into curated deals. That alone tells you a lot. If the industry is still building standards to reduce manual deal configuration and clarify who curated and sold a package, then the old workflow clearly carries substantial operational drag.
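To make the provenance idea concrete, here is a minimal sketch of the kind of deal record such standardization points toward. Every field name, class, and value below is our own illustrative assumption; this is not the IAB Tech Lab Deals API schema.

```python
from dataclasses import dataclass, field

# Hypothetical deal record, inspired by the stated goals of deal-workflow
# standardization: less manual entry, visible curation provenance.
# Field names are illustrative assumptions, NOT the actual specification.
@dataclass
class CuratedDeal:
    deal_id: str
    seller: str                                    # who sold the package
    curators: list = field(default_factory=list)   # who curated it, in order
    floor_cpm_usd: float = 0.0

    def provenance(self) -> str:
        """Human-readable chain of every party that touched this deal."""
        return " -> ".join(self.curators + [self.seller])

deal = CuratedDeal(
    deal_id="deal-123",
    seller="ExampleSSP",
    curators=["ExampleCurator"],
    floor_cpm_usd=4.50,
)
print(deal.provenance())  # ExampleCurator -> ExampleSSP
```

When the chain of curators and sellers is machine-readable rather than buried in emails and spreadsheets, comparing the price and origin of two packages stops being guesswork.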
Put simply, fragmented programmatic creates a disproportionate amount of overhead relative to the portion of media budget it touches. It pushes teams toward:
spreadsheet reconciliation
manual deal setup and QA
supply-path guesswork
conflicting performance interpretations
repeated internal debates about whose numbers to trust
⚡ Fragmented buying models burn more than budget. Time, attention, and decision quality all diminish alongside it.
How Open Garden works
The Open Garden framework works as an operational system made up of four connected components: vendor-neutral architecture, curated supply strategy, AI-powered execution, and unified cross-channel measurement.
The point is not that each component is novel on its own. The point is that they are designed to work together.
Vendor-neutral architecture
Vendor-neutral architecture removes the default bias that creeps into platform-led execution. Instead of building plans around whichever platform is most familiar, politically convenient, or easiest to report on, the framework allows DSPs and supply paths to be chosen according to objective fit and business goals.
This matters because neutrality improves more than flexibility. It improves accountability. If one supply path, DSP, or channel combination is underperforming, the operating model should allow budget and emphasis to shift without unraveling the whole system.
Curated supply strategy
Curated supply is where theory starts to become economics.
AI Digital’s Smart Supply proposition centers on curated inventory, supply path optimization, transparency, and active filtering of low-value traffic before it reaches the client. At the same time, IAB Tech Lab’s Deals API now explicitly aims to reduce manual entry, clarify deal terms, and identify which parties were involved in curating and selling a package.
That is an important signal. Curation is no longer just a sales term. It is becoming something the industry is trying to standardize and make more transparent.
A strong curated supply strategy does three things:
Improves quality control by narrowing exposure to better inventory.
Reduces unnecessary tech-path complexity through SPO and cleaner deal structures.
Makes pricing and provenance easier to evaluate because the path is more visible.
AI-powered execution
In a serious operating model, AI should sit in the execution layer, not in the headline.
AI Digital’s Elevate is a good example here. It supports AI-assisted planning, budget allocation, optimization, forecasting, and cross-platform insight generation, while keeping human oversight in place. That is the right emphasis. The value is not “AI did it.” The value is that complexity can be handled faster and more consistently without surrendering strategic control.
The timing adds urgency. IAB's 2026 outlook describes a market transitioning from AI experimentation to AI as infrastructure, and IAB's State of Data 2026 findings show buy-side teams already expecting AI to deliver material improvements in measurement and productivity in the near term.
Time expected to scale AI adoption in advanced measurement (Source)
Unified cross-channel measurement
This may be the most important component of all, because without it, orchestration becomes guesswork.
Project Eidos is explicit about the need to replace today’s patchwork of channel-by-channel measurement with an interoperable approach built on shared constructs, consistent language, and clearer standards for MMM. That is almost a direct endorsement of the kind of measurement layer an Open Garden operating model needs.
Expecting every platform to report identically is unrealistic. Unified measurement provides something more practical: a framework that makes the differences between platforms governable, allowing reach, frequency, outcomes, attribution, and incrementality to be interpreted more consistently across channels.
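The mechanics of that governable layer can be sketched simply: each platform keeps its own report format, and a small per-platform adapter maps every report into one shared record shape before any comparison happens. All platform names, field names, and numbers below are hypothetical.

```python
# Minimal sketch of a shared measurement layer. Each platform reports in its
# own schema; an adapter per platform maps it into one comparable shape.
# Platforms, fields, and figures here are illustrative assumptions.
raw_reports = {
    "platform_a": {"imps": 120_000, "convs": 840, "spend_usd": 6_000.0},
    "platform_b": {"impressions": 95_000, "purchases": 512, "cost": 4_200.0},
}

# One adapter per platform: the only place platform-specific naming survives.
ADAPTERS = {
    "platform_a": lambda r: {"impressions": r["imps"],
                             "conversions": r["convs"],
                             "spend": r["spend_usd"]},
    "platform_b": lambda r: {"impressions": r["impressions"],
                             "conversions": r["purchases"],
                             "spend": r["cost"]},
}

def normalize(raw_reports):
    """Return every platform's report in one shared schema, with derived CPA."""
    unified = {}
    for platform, report in raw_reports.items():
        row = ADAPTERS[platform](report)
        row["cpa"] = round(row["spend"] / row["conversions"], 2)
        unified[platform] = row
    return unified

for platform, row in normalize(raw_reports).items():
    print(platform, row)
```

The adapters do not make the platforms agree; they make the disagreements explicit and confined to one place, which is what turns channel comparison from a debate into a design decision.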
How is this different from a typical programmatic setup?
A typical setup is often platform-first. An Open Garden setup is KPI-first.
That sounds simple, but it changes nearly everything.
In a typical model, planning starts with available platforms, inherited partner structures, and historical channel habits. In an Open Garden model, planning starts with what the business is trying to achieve, then works backward into architecture, supply, activation, and measurement choices.
In practice, the difference looks like this:
Typical setup | Open Garden setup
Optimize inside platforms | Optimize across the system
Accept fragmented reporting | Build for comparability
Treat supply as inventory access | Treat supply as a strategic lever
Use AI tactically | Use AI operationally, with guardrails
Measure after the fact | Make measurement part of execution design
⚡ That is why the Open Garden framework is better described as an advertising governance framework or cross-platform governance framework than as a media-buying trick.
What changes for marketers
For marketers, the most meaningful change is not abstract flexibility but practical relief.
A better operating model should reduce the amount of time teams spend reconciling systems and increase the amount of time they spend making decisions. It should make trade-offs more visible. It should expose platform bias more clearly. It should improve confidence in what to scale, what to cut, and what to test next.
That creates several concrete benefits:
simpler decision-making
clearer supply accountability
less manual coordination
more transparent optimization
better alignment between media activity and business KPIs
It also changes the marketer’s role. Instead of being trapped in platform administration, the team can spend more time on governance, prioritization, and growth logic.
The reason this model works now is that it is built for conditions that are unlikely to disappear.
Privacy rules are not going away. Channel sprawl is not going away. Clean rooms, first-party data matching, and consent signaling are not side stories anymore. They are part of the infrastructure.
IAB Tech Lab describes the Global Privacy Platform as a protocol and set of APIs for signaling user privacy consent and choice through the digital advertising supply chain, with a channel-agnostic architecture designed to adapt to changing regulation. It also describes PAIR as a privacy-centric approach that enables advertisers and publishers to reconcile first-party data without relying on third-party cookies, including interoperability between data clean rooms.
Those developments support the broader point: the future is not a return to a simpler stack. It is a more interoperable one.
That is why the Open Garden framework is relevant. It does not depend on one dominant platform, one identity method, or one reporting worldview. It is designed to stay adaptable as channels, privacy rules, and measurement standards evolve.
Conclusion: A new way to run programmatic
The industry does not need more tools in isolation. It needs a better way to connect them. That is the central case for the Open Garden framework.
Programmatic advertising is not short on access, automation, or inventory. What it lacks is a durable operating model for coordinating those assets across a fragmented ecosystem. The Open Garden advertising framework offers a practical answer: neutral architecture, curated supply, AI-assisted execution, and cross-channel measurement working as one system instead of four separate conversations.
For marketers, that means fewer black boxes, cleaner governance, and more room to make performance decisions based on business reality rather than platform convenience.
AI Digital positions its own approach around that combination of DSP-agnostic execution, transparency, curated supply, and AI-powered optimization, delivered through managed services, Smart Supply, and Elevate. For brands trying to operate across CTV, programmatic, and multi-channel performance without becoming captive to fragmented systems, that is a useful place to start the conversation. Why not get in touch?
Blind spot 1: Lack of transparency in AI models
Key issues: platforms own the AI models and train them on proprietary data; brands have little visibility into decision-making; “walled gardens” restrict data access.
Business impact: inefficient ad spend, limited strategic control, eroded consumer trust, and potential budget mismanagement.
AI Digital solution: the Open Garden framework, providing complete transparency, DSP-agnostic execution, and cross-platform data and insights.

Blind spot 2: Optimizing ads vs. optimizing impact
Key issues: AI excels at short-term metrics but may struggle with brand building; consumers can detect AI-generated content; efficiency might come at the cost of authenticity.
Business impact: short-term gains at the expense of brand health, potential loss of authentic connection, and reduced effectiveness in storytelling.
AI Digital solution: Smart Supply, offering human oversight of AI recommendations, custom KPI alignment beyond clicks, and brand-safe inventory verification.

Blind spot 3: The illusion of personalization
Key issues: segment optimization rebranded as personalization; first-party data infrastructure challenges; personalization vs. surveillance concerns.
Business impact: a potential mismatch between promise and reality, privacy concerns affecting consumer trust, and cost barriers for smaller businesses.
AI Digital solution: the Elevate platform, combining real-time AI with human intelligence, first-party data activation, and ethical personalization strategies.

Blind spot 4: AI-driven efficiency vs. decision-making
Key issues: AI shifting from tool to decision-maker; black-box optimization like Google Performance Max; limits on human oversight.
Business impact: loss of strategic control, difficulty questioning AI outputs, inability to measure granular impact, and potential brand damage from mistakes.
AI Digital solution: Managed Service, with human strategists overseeing AI, custom KPI optimization, and complete campaign transparency.

Fig. 1. Summary of AI blind spots in advertising.
Dimension | Walled garden advantage | Walled garden limitation | Strategic impact
Audience access | Massive, engaged user bases | Limited visibility beyond the platform | Reach without understanding
Data control | Sophisticated targeting tools | Data remains siloed within the platform | Fragmented customer view
Measurement | Detailed in-platform metrics | Inconsistent cross-platform standards | Difficult performance comparison
Intelligence | Platform-specific insights | Limited data portability | Restricted strategic learning
Optimization | Powerful automated tools | Black-box algorithms | Reduced marketer control

Fig. 2. Strategic trade-offs in walled garden advertising.
Core issue | Platform priority | Walled garden limitation | Real-world example
Attribution opacity | Claiming maximum credit for conversions | Limited visibility into true conversion paths | Meta and TikTok’s conflicting attribution models after iOS privacy updates
Data restrictions | Maintaining proprietary data control | Inability to combine platform data with other sources | Amazon DSP’s limitations on detailed performance data exports
Cross-channel blind spots | Keeping advertisers within the ecosystem | Fragmented view of the customer journey | YouTube/DV360 campaigns lacking integration with non-Google platforms
Black-box algorithms | Optimizing for platform revenue | Reduced control over campaign execution | Self-serve platforms using opaque ML models with little advertiser input
Performance reporting | Presenting the platform in the best light | Discrepancies between platform-reported and independently measured results | Consistently higher performance metrics in platform reports vs. third-party measurement

Fig. 3. The walled garden misalignment: platform interests vs. advertiser needs.
Key dimension | Challenge | Strategic imperative
ROAS volatility | Softer returns across digital channels | Shift from soft KPIs to measurable revenue impact
Media planning | Static plans are no longer effective | Develop agile, modular approaches adaptable to changing conditions
Brand/performance | The traditional division is dissolving | Create full-funnel strategies balancing long-term equity with short-term conversion

Capability: Elevate forecasting tool
Key features: vertical-specific insights; historical data from past economic turbulence; “cascade planning” functionality; real-time adaptation.
Benefits: agility to adjust campaign strategy based on performance; visibility into which media channels drive efficient, effective performance; confident budget reallocation; faster reaction to market shifts.
Performance data: dataset from 10,000+ campaigns; cuts response time from weeks to minutes.

Benefits: reaches the people most likely to buy; avoids wasting impressions and budget on poor-performing placements; context-aligned messaging.
Performance data: 25+ billion bid requests analyzed daily; 18% improvement in working media efficiency; 26% increase in engagement during recessions.

Capability: Full-funnel accountability
Key features: links awareness campaigns to lower-funnel outcomes; tests whether ads actually drive new business; measures brand perception changes; “Ask Elevate” AI chat assistant.
Benefits: upper-funnel-to-outcome connection; sentiment shift tracking; personalized messaging; balances immediate sales against long-term brand building.
Performance data: natural-language data queries; true business impact measurement.

Capability: Open Garden approach
Key features: cross-platform and cross-channel planning; no lock-in to specific platforms; unified cross-platform reach; shows exactly where money is spent.
Benefits: reduces complexity across channels; performance-based ad placement; rapid budget reallocation; removes platform-specific commitments while preserving optimization agility.
Performance data: coverage across all inventory sources; full visibility into spending; the ability to pivot across platforms rather than being confined to one.

Fig. 4. How AI Digital helps during economic uncertainty.
Trend | What it means for marketers
Supply & demand lines are blurring | Platforms from Google (P-Max) to Microsoft are merging optimization and inventory in one opaque box. Expect more bundled “best available” media where the algorithm, not the trader, decides the channel and publisher mix.
Walled gardens get taller | Microsoft’s O&O set now spans Bing, Xbox, Outlook, Edge, and LinkedIn, which just launched revenue-sharing video programs to lure creators and ad dollars. (Business Insider)
Retail & commerce media shape strategy | Microsoft’s Curate lets retailers and data owners package first-party segments, echoing Amazon’s and Walmart’s approaches. Agencies must master seller-defined audiences as well as buyer-side tactics.
AI oversight becomes critical | Closed AI bidding means fewer levers for traders. Independent verification, incrementality testing, and commercial guardrails rise in importance.

Fig. 5. Platform trends and their implications.
Metric | Connected TV (CTV) | Linear TV
Video completion rate | 94.5% | 70%
Purchase rate after ad | 23% | 12%
Ad attention rate | 57% (prefer CTV ads) | 54.5%
Viewer reach (U.S.) | 85% of households | 228 million viewers
Identify and categorize audience groups based on behaviors, preferences, and characteristics.
Examples: Michaels Stores implemented a genAI platform that increased email personalization from 20% to 95%, leading to a 41% boost in SMS click-through rates and a 25% increase in engagement. Estée Lauder partnered with Google Cloud to leverage genAI for real-time monitoring and analysis of consumer feedback and sentiment across channels.
Ratings: High / Medium

Automated ad campaigns: automate ad creation, placement, and optimization across various platforms.
Examples: Showmax partnered with AI firms to automate ad creation and testing, reducing production time by 70% while streamlining its quality assurance process. Headway employed AI tools for ad creation and optimization, boosting performance by 40% and reaching 3.3 billion impressions, with AI-generated content in 20% of its paid campaigns.
Ratings: High / High

Brand sentiment tracking: monitor and analyze public opinion about a brand across multiple channels in real time.
Examples: L’Oréal analyzed millions of online comments, images, and videos to identify product innovation opportunities and track brand sentiment and consumer trends. Kellogg Company used AI to scan trending recipes featuring cereal, launching targeted social campaigns that capitalize on positive sentiment and culinary trends.
Ratings: High / Low

Campaign strategy optimization: analyze data to predict optimal campaign approaches, channels, and timing.
Examples: DoorDash leveraged Google’s AI-powered Demand Gen tool, which boosted its conversion rate by 15 times and improved cost-per-action efficiency by 50% compared with previous campaigns. Kitsch employed Meta’s Advantage+ shopping campaigns with AI-powered tools to identify and deliver top-performing ads to high-value consumers.
Ratings: High / High

Content strategy: generate content ideas, predict performance, and optimize distribution strategies.
Examples: JPMorgan Chase collaborated with Persado to develop LLMs for marketing copy, achieving up to 450% higher click-through rates than human-written ads in pilot tests. Hotel Chocolat employed genAI for concept development and production of its Velvetiser TV ad, which earned the highest-ever System1 score for a domestic appliance commercial.
Ratings: High / High

Personalization strategy development: create tailored messaging and experiences for consumers at scale.
Examples: Stitch Fix uses genAI to help stylists interpret customer feedback and provide product recommendations, personalizing shopping experiences. Instacart uses genAI to offer customers personalized recipes, meal-planning ideas, and shopping lists based on individual preferences and habits.
Ratings: Medium / Medium
Questions? We have answers
Should I still use walled gardens?
Yes. Open Garden is not an argument for refusing to use walled gardens. It is an argument against letting them define your entire operating model. The point is to use them where they make sense while preserving cross-platform control, comparability, and governance.
Does this require new tools or integrations?
Not always. In many cases, the bigger change is operational, not purely technical. Open Garden is about structuring how the existing ecosystem works together. Some implementations may add tools or integrations, but the model itself is not dependent on a single new platform.
How does measurement work across different platforms?
It works by building shared measurement logic above platform-level reporting. That does not eliminate all channel differences, but it makes them more governable. Industry efforts such as Project Eidos show why this matters: current measurement remains too patchworked and inconsistent for many buyers.
Do I lose control by relying on AI?
Not if AI is used properly. In an Open Garden model, AI supports execution—forecasting, allocation, optimization, and pattern detection—but strategic oversight remains human-led. AI Digital’s own positioning for Elevate reflects that balance.
Is this only relevant for large advertisers?
No. Larger advertisers may feel the pain sooner because complexity scales with spend and channel count, but smaller or mid-sized teams can also benefit. Fragmentation affects anyone trying to compare performance, manage supply quality, and keep governance intact across multiple environments.
How long does it take to implement Open Garden?
There is no universal timeline because this is not a plug-and-play widget. Implementation depends on how fragmented the current setup is, how mature measurement is, what supply relationships already exist, and how much internal alignment is needed. The best way to think about it is as an operating-model shift that can be phased rather than a single switch flipped overnight.
Have other questions?
If you have more questions, contact us so we can help.