In our recent webcast, three researchers came together for a panel conversation to challenge a widely held assumption in the business world: that speed and confidence are in tension. Bianca, Holly, and Bahar argued the opposite. When research is treated as infrastructure rather than a one-time deliverable, organizations can move faster and with more conviction.
The Hidden Cost of Moving Too Fast
The panel opened with a discussion on speed versus confidence, and Bianca set the tone with an example from their time at Indeed. When the company shifted its pricing model from pay-per-click to pay-per-completed application in 2021, early revenue and adoption metrics lit up immediately.
Over time, usage dropped, support tickets surged, and customers grew vocal about losing control over their spend. The press eventually framed it as a dark UI pattern, turning a promising product bet into a reputational problem requiring expensive fixes. Bianca was clear that the pricing change itself wasn’t the mistake; the mistake was moving faster than validation could keep up, with no one modeling what those early signals would mean for long-term trust and retention.
Holly built on that point by describing what happens organizationally when an unvalidated signal goes unchecked. Marketing reframes its narrative, sales updates its decks, and leadership declares a win. By the time warning signals arise, the original assumption has long since stopped being tested. She called this “signal drift,” the moment when an organization’s internal story of success detaches from the reality customers are actually experiencing. Preventing it, she argued, requires structure and frameworks that keep interpretation anchored as the organization moves.
Bahar added the psychological dimension: speed can become a bias in itself. When everything is moving fast, motion feels like progress. Teams become more focused on maintaining pace than on asking whether they are heading in the right direction. Validation has to be embedded in culture, not treated as a box to check at the end.
Research as Infrastructure, Not a One-Off Event
The panel’s second theme pushed further into systematizing research, with Holly drawing an analogy from medicine. At Great Ormond Street Hospital in London, a cardiac surgery team struggling with chaotic post-operative handoffs found an unlikely model for improvement: a Formula One pit crew. After inviting an F1 team to study their process, they redesigned the handoff with assigned roles, sequenced actions, and rehearsed choreography. Errors dropped, and the hospital didn’t treat it as a one-time fix. It institutionalized the protocols and trained every new staff member in the same choreography.
The panelists emphasized that research functions need to make this same shift. When insights live in isolated slide decks, their value decays; when they are embedded into the operating system of an organization, they compound. Bianca extended the point: structured inputs that connect customer experience, product performance, and internal priorities become the foundation for the AI models and systems that will support future decisions.
Rigorous Research Is the AI Readiness Plan
Bahar cited an MIT report finding that 95% of enterprise generative AI pilots failed to deliver measurable return on investment. The problem is rarely the model but what feeds it: AI does not understand why something is happening, it only detects patterns. If those patterns come from weak or narrowly validated data, the system will scale the wrong signals faster and further than any human team could.
She illustrated this with a healthcare example: an AI system built to recommend cancer treatments performed well in the major centers whose data trained it, but broke down when deployed in other countries where patient needs and drug access differed significantly. The data and validation were too narrow to generalize.
Bianca connected this directly to research practice. Companies succeeding with AI are those already treating research as decision intelligence infrastructure, teams that have defined what their data means, connected signals to outcomes, and validated those relationships over time. AI doesn’t create structure but amplifies whatever structure already exists. Rigor is the language that lets human intent and machine logic understand each other. Clear frameworks today don’t just prepare organizations for automation; they protect the reasoning that makes systems accountable.
Slow Down to Speed Up
The organizations that will thrive in an AI-driven environment are not necessarily the ones moving fastest, but the ones that have built the most reliable systems underneath their speed. Research, when treated as a true system, stops being a cost center and starts becoming the compounding asset that makes everything else more accurate, accountable, and durable over time.
Register to watch the webcast replay here.