
Applied AI

How to Lead AI Engineering Teams

Have you ever wondered why some teams seem to effortlessly deliver value while others stay busy but make no real progress?

I recently had a conversation that completely changed how I think about leading teams. While discussing team performance with a VP of Engineering who was frustrated with their team's slow progress, I suggested focusing on better standups and more experiments.

That's when Skylar Payne dropped a truth bomb that made me completely rethink everything:

"Leaders are living and breathing the business strategy through their meetings and context, but the people on the ground don't have any fucking clue what that is. They're kind of trying to read the tea leaves to understand what it is."

That moment was a wake-up call.

I had been so focused on the mechanics of execution that I'd missed something fundamental: The best processes in the world won't help if your team doesn't understand how their work drives real value.

In less than an hour, I learned more about effective leadership than I had in the past year. Let me share what I discovered.

The Process Trap

For years, I believed the answer to team performance was better processes. More standups, better ticket tracking, clearer KPIs.

I was dead wrong.

Here's the truth that surprised me: The most effective teams have very little process. What they do have is:

  • Crystal clear alignment on what matters
  • A shared understanding of how the business works
  • The ability to make independent decisions
  • A systematic way to learn and improve

Let me break down how to build this kind of team.

The "North Star" Framework

Instead of more process, teams need a clear way to connect their daily work to real business value. This is where the North Star Framework comes in.

Here's how it works:

  1. Define One Key Metric: Choose a single metric that summarizes the value you deliver to customers. For example, Amplitude uses "insights shared and read by at least three people."

  2. Break It Down: Identify the key drivers that teams can actually impact. These become your focus areas.

  3. Create a Rhythm:
     • Weekly: Review input metrics
     • Quarterly: Check the relationships between inputs and your North Star
     • Yearly: Validate that your North Star predicts revenue

  4. Make It Visible: Run weekly business reviews where leadership shares these metrics with everyone. Start manual before building dashboards - trustworthy data matters more than automation.

This framework does something powerful: it helps every team member understand how their work drives real value.
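
To make the framework concrete, here is a minimal sketch of how a team might write it down as plain data. The metric names and cadences are illustrative assumptions, not a prescribed template:

```python
# A minimal sketch of a North Star Framework captured as plain data.
# Metric names and cadences below are illustrative, not prescriptive.

north_star_framework = {
    # The single metric that summarizes customer value (Amplitude's example).
    "north_star": "insights shared and read by at least three people",

    # Input metrics: the drivers individual teams can actually move.
    "inputs": [
        "new insights created per week",
        "percent of insights shared with 3+ readers",
        "weekly active analysts",
    ],

    # The review rhythm described above.
    "rhythm": {
        "weekly": "review input metrics",
        "quarterly": "check how inputs correlate with the North Star",
        "yearly": "validate that the North Star predicts revenue",
    },
}

if __name__ == "__main__":
    for cadence, activity in north_star_framework["rhythm"].items():
        print(f"{cadence:>9}: {activity}")
```

Even a plain document like this, shared widely, gives every team member the same map from daily work to customer value.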

The Weekly Business Review

One of the most powerful tools in this framework is the weekly business review. But this isn't your typical metrics meeting.

Here's how to make it work:

  • Make it a leadership-level meeting that ICs can attend
  • Focus on building business intuition, not just sharing numbers
  • Take notes on anomalies and patterns
  • Share readouts with the entire team
  • Use it to develop a shared mental model of how the business works
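
If you keep the underlying numbers in a simple spreadsheet, even a tiny script can surface the anomalies worth discussing. Here is a minimal sketch; the sample metrics and values are made up for illustration:

```python
# A minimal sketch of a weekly readout: flag input metrics whose latest value
# deviates sharply from their history. The numbers are made up; in practice the
# rows would come from the team's manually maintained spreadsheet.
from statistics import mean, stdev

weekly_values = {
    # metric name -> one value per week, oldest first
    "new insights created": [40, 42, 39, 41, 44, 43],
    "insights read by 3+ people": [18, 19, 20, 21, 22, 35],
}

def flag_anomalies(series: dict[str, list[float]], z_threshold: float = 2.0) -> None:
    for metric, values in series.items():
        history, latest = values[:-1], values[-1]
        sigma = stdev(history) or 1.0
        z = (latest - mean(history)) / sigma
        note = "ANOMALY - worth discussing in the review" if abs(z) >= z_threshold else "ok"
        print(f"{metric}: latest={latest} (z={z:+.1f}) {note}")

flag_anomalies(weekly_values)
```

The point isn't the statistics; it's that the whole team sees the same numbers and builds the same intuition about what "normal" looks like.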

Rethinking Team Structure

Here's another counterintuitive insight: how you organize your teams might be creating unnecessary friction.

Instead of dividing responsibilities by project, try dividing them by metrics. Here's why:

  • Project-based teams require precise communication boundaries
  • Metric-based teams can work more fluidly
  • It reduces communication overhead
  • Teams naturally align around outcomes instead of outputs

Think about it: When teams own metrics instead of projects, they have the freedom to find the best way to move those metrics.

Early Stage? Even More Important

I know what you're thinking: "This sounds great for big companies, but we're too early for this."

That's what I thought too. But here's what I learned: Being early stage isn't an excuse for throwing spaghetti at the wall.

You can still be systematic, just differently:

  1. Start Qualitative:
     • Draft clear goals and hypotheses
     • Generate specific questions to validate them
     • Talk to customers systematically
     • Document and learn methodically

  2. Focus on Learning:
     • Treat tickets as experiments, not features
     • Make outcomes about learning, not just shipping
     • Accept that progress is nonlinear
     • Build systematic ways to capture insights

  3. Build Foundations:
     • Document your strategy clearly
     • Make metrics and goals transparent
     • Share regular updates on progress
     • Create systems for capturing and sharing learnings

The Experiment Mindset

One crucial shift is thinking about work differently:

  • The ticket is not the feature
  • The ticket is the experiment
  • The outcome is learning

This mindset change helps teams focus on value and learning rather than just shipping features.
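
One lightweight way to make this concrete is to give experiment tickets an explicit shape. Here is a minimal sketch; the field names and example values are purely illustrative:

```python
# A minimal sketch of a ticket written as an experiment rather than a feature.
# Field names and the example below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ExperimentTicket:
    hypothesis: str      # what we believe and why
    metric: str          # the input metric this should move
    method: str          # the smallest change that tests the hypothesis
    result: str = ""     # observed outcome, filled in after the work
    learning: str = ""   # what we now know, even if the metric didn't move

ticket = ExperimentTicket(
    hypothesis="Adding retrieved examples to the prompt will reduce hallucinations",
    metric="percent of answers grounded in a cited source",
    method="ship prompt variant B to 10% of traffic for one week",
)
print(ticket)
```

A ticket that ships but teaches you nothing is incomplete; a ticket that "fails" but fills in the learning field is still progress.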

Put It Into Practice

Here are five things you can do today to start implementing these ideas:

  1. Define Your North Star: What's the one metric that best captures the value you deliver to customers?

  2. Start Weekly Business Reviews: Schedule a weekly meeting to review key metrics with your entire team. Start simple - even a manual spreadsheet is fine.

  3. Audit Your Process: Look at every process you have. Ask: "Is this helping people make better decisions?" If not, consider dropping it.

  4. Document Your Strategy: Write down how you think the business works. Share it widely and iterate based on feedback.

  5. Shift to Experiments: Start treating work as experiments to test hypotheses rather than features to ship.

The Real Test

The real test of whether this is working isn't in your processes or even your metrics. It's in whether every team member can confidently answer these questions:

  • "What should I be spending my time on today?"
  • "How does my work drive value for our business?"
  • "What am I learning that could change our direction?"

When your team can answer these without hesitation, you've built something special.

Remember: Your team members are smart, capable people. They don't need more process - they need context and clarity to make good decisions.

Give them that, and you'll be amazed at what they can achieve.

P.S. What would you say is your team's biggest obstacle to working this way? Leave a comment below.

SWE vs AI Engineering Standups

When I talk to engineering leaders struggling with their AI teams, I often hear the same frustration: "Why is everything taking so long? Why can't we just ship features like our other teams?"

This frustration stems from a fundamental misunderstanding: AI development isn't just engineering - it's applied research. And this changes everything about how we need to think about progress, goals, and team management. In a previous article I wrote about communication for AI teams. Today I want to talk about standups specifically.

The ticket is not the feature, the ticket is the experiment, the outcome is learning.

The right way to do AI engineering updates

Helping software engineers enhance their AI engineering processes through rigorous and insightful updates.


In AI engineering, effective communication is crucial to project success. Consider two scenarios:

Scenario A: "We made some improvements to the model. It seems better now."

Scenario B: "Our hypothesis was that fine-tuning on domain-specific data would improve accuracy. We implemented this change and observed a 15% increase in F1 score, from 0.72 to 0.83, on our test set. However, inference time increased by 20ms on average."

Scenario B clearly provides more value and allows for informed decision-making. After collaborating with numerous startups on their AI initiatives, I've witnessed the transformative power of precise, data-driven communication. It's not just about relaying information; it's about enabling action, fostering alignment, and driving progress.
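
One way to nudge updates toward Scenario B is to compute them from the numbers themselves. Here is a minimal sketch that reuses the figures from the example above; the function and its parameters are illustrative, not a prescribed format:

```python
# A minimal sketch of turning raw evaluation numbers into a Scenario-B-style
# update line. Metric names and values mirror the example above.

def format_update(hypothesis: str, metric: str, baseline: float,
                  observed: float, tradeoff: str) -> str:
    lift = (observed - baseline) / baseline  # relative change vs. baseline
    return (
        f"Hypothesis: {hypothesis}. "
        f"Result: {metric} moved from {baseline} to {observed} ({lift:+.0%}). "
        f"Tradeoff: {tradeoff}."
    )

print(format_update(
    hypothesis="fine-tuning on domain-specific data improves accuracy",
    metric="F1 on the test set",
    baseline=0.72,
    observed=0.83,
    tradeoff="average inference time increased by 20ms",
))
```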

A surprising reason to not list your consulting prices

As I've shared insights on indie consulting, marketing strategies, and referral techniques, a recurring question from my newsletter subscribers is about pricing. Specifically, many ask if they should lower their rates or make them public.

In this article, we'll delve into the counterintuitive reasons why listing your consulting prices might not be the best strategy, regardless of whether you're aiming to appear affordable or exclusive. We'll explore the potential drawbacks of transparent pricing, introduce more effective alternatives like minimum level of engagement pricing, and provide actionable strategies to help you maximize your value and earnings as a consultant.

Building on the foundation laid in my previous posts about building a consulting practice and using the right tools, this piece will add another crucial element to your consulting toolkit: strategic pricing.

Implementing Naturalistic Dialogue in AI Companions

Ever think, "This AI companion sounds odd"? You're onto something. Let's explore naturalistic dialogue and how it could change our digital interactions.

I've been focused on dialogue lately. Not the formal kind, but the type you'd hear between friends at a coffee shop. Conversations that flow, full of inside jokes and half-finished sentences that still make sense. Imagine if your AI companion could chat like that.

This post will define naturalistic dialogue, characterized by:

  1. Contextual efficiency: saying more with less
  2. Implicit references: alluding rather than stating
  3. Fragmentation: incomplete thoughts and imperfections
  4. Organic flow: spontaneity

We'll then examine AI-generated dialogue challenges and propose a solution using chain-of-thought reasoning and planning to craft more natural responses.
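
To give a flavor of that approach, here is a minimal sketch of a planning prompt that asks the model to reason about context before writing the casual surface reply. The prompt wording and the `call_llm` stub are assumptions for illustration, not the post's actual implementation:

```python
# A minimal sketch of chain-of-thought planning for naturalistic dialogue:
# the model reasons about context and plans a reply before writing the casual
# surface text. The prompt and the call_llm stub are illustrative assumptions.

PLANNING_PROMPT = """You are a close friend of the user, not an assistant.

Conversation so far:
{history}

First, think step by step (do not show this to the user):
1. What is the user implicitly referring to from earlier in the conversation?
2. What shared context or inside joke can be alluded to rather than restated?
3. What would a friend actually say: short, fragmented, spontaneous?

Then write only the reply, in one or two casual sentences."""

def call_llm(prompt: str) -> str:
    # Placeholder for whatever model client you use.
    raise NotImplementedError

def natural_reply(history: str) -> str:
    return call_llm(PLANNING_PROMPT.format(history=history))

if __name__ == "__main__":
    print(PLANNING_PROMPT.format(history="User: ugh, Monday again\nCompanion: the usual?"))
```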

Art of Looking at RAG Data

In the past year, I've done a lot of consulting helping companies improve their RAG applications. One of the biggest ideas I want to call out is the distinction between topics and capabilities.

I use this distinction to train teams to look at the data they already have and figure out what they need to build next.
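
In practice, that starts with tagging real query logs. Here is a minimal sketch; the queries, labels, and satisfaction signals are made up for illustration:

```python
# A minimal sketch of the topics-vs-capabilities split applied to RAG query logs.
# The example queries, labels, and thumbs_up signals are made up for illustration.
from collections import defaultdict

# Each logged query gets a topic (what it's about) and a capability
# (what kind of operation the system must perform to answer it).
logs = [
    {"query": "vacation policy for contractors", "topic": "HR", "capability": "lookup", "thumbs_up": True},
    {"query": "compare Q3 vs Q4 revenue", "topic": "finance", "capability": "comparison", "thumbs_up": False},
    {"query": "summarize the latest security audit", "topic": "security", "capability": "summarization", "thumbs_up": True},
    {"query": "trend of support tickets this year", "topic": "support", "capability": "timeseries", "thumbs_up": False},
]

stats: dict[str, list[bool]] = defaultdict(list)
for row in logs:
    stats[row["capability"]].append(row["thumbs_up"])

# Capabilities with high volume and low satisfaction are candidates to build next.
for capability, outcomes in sorted(stats.items()):
    rate = sum(outcomes) / len(outcomes)
    print(f"{capability}: {len(outcomes)} queries, {rate:.0%} satisfied")
```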

10 Ways to Be Data Illiterate (and How to Avoid Them)

Data literacy is an essential skill in today's data-driven world. As AI engineers, understanding how to properly handle, analyze, and interpret data can make the difference between success and failure in our projects. In this post, we will explore ten common pitfalls that lead to data illiteracy and provide actionable strategies to avoid them. By becoming aware of these mistakes and learning how to address them, you can enhance your data literacy and ensure your work is both accurate and impactful. Let's dive in and discover how to navigate the complexities of data with confidence and competence.

Data Flywheel Go Brrr: Using Your Users to Build Better Products

You need to take advantage of your users wherever possible. It's become a bit of a cliche that customers are your most important stakeholders. In the past, that meant customers bought the product the company sold and kept it solvent. But as AI seemingly conquers everything, businesses need replicable processes for building products that meet their users' needs and are flexible enough to keep improving over time. That makes your users your most important asset in improving your product - so use them to build a better one!

Unraveling the History of Technological Skepticism

Technological advancements have always been met with a mix of skepticism and fear. From the telephone disrupting face-to-face communication to calculators diminishing mental arithmetic skills, each new technology has faced resistance. Even the written word was once believed to weaken human memory.

| Technology | Perceived Threat |
| --- | --- |
| Telephone | Disrupting face-to-face communication |
| Calculators | Diminishing mental arithmetic skills |
| Typewriter | Degrading writing quality |
| Printing Press | Threatening manual script work |
| Written Word | Weakening human memory |

A feat of strength MVP for AI Apps

A minimum viable product (MVP) is a version of a product with just enough features to be usable by early customers, who can then provide feedback for future product development.

Today I want to focus on what that looks like for shipping AI applications. To do that, we only need to understand 4 things.

  1. What does 80% actually mean?

  2. What segments can we serve well?

  3. Can we double down?

  4. Can we educate the user about the segments we don’t serve well?

The Pareto principle, also known as the 80/20 rule, still applies but in a different way than you might think.
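
As a rough sketch of what answering the first three questions with data can look like, you can measure quality per user segment, see which segments you already serve well, and decide where to double down versus set expectations. The segments and pass rates below are made up for illustration:

```python
# A minimal sketch: measure quality per user segment to decide which segments
# to serve, where to double down, and where to educate users instead.
# Segments, results, and the 0.8 threshold are illustrative assumptions.
from collections import defaultdict

evals = [
    {"segment": "sales emails", "passed": True},
    {"segment": "sales emails", "passed": True},
    {"segment": "legal review", "passed": False},
    {"segment": "legal review", "passed": True},
    {"segment": "support replies", "passed": True},
]

by_segment: dict[str, list[bool]] = defaultdict(list)
for e in evals:
    by_segment[e["segment"]].append(e["passed"])

for segment, results in by_segment.items():
    rate = sum(results) / len(results)
    verdict = "serve and double down" if rate >= 0.8 else "educate users / defer"
    print(f"{segment}: {rate:.0%} pass rate -> {verdict}")
```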