Reflections on AONW 2026
- domingo
- 3 hours ago
- 4 min read
This year, I had the incredible opportunity to attend Agile Open Northwest 2026 in Portland, OR. Both the people and the atmosphere make AONW my favorite conference to attend by far, and I wanted to take some time to reflect on what I learned this year.
On the progress of AI
As one might expect at a tech conference, AI was on everyone’s mind. As someone relatively new to the tech community, I came in with the main goal of hearing what the community was saying about it.
Last year, the conversations sounded a lot like “this is what AI can do now.” I remember sitting in a room watching Claude fumble its way through solving simple algorithmic problems. This year, the conversation had changed. People were talking about writing skills for the AI to employ, how to orchestrate multiple AI agents to perform a task, and which parts of a task should be performed sequentially versus in parallel.
At one point, I watched a team of experienced professionals go head-to-head with Claude while refactoring a poorly written kata. It was awe-inspiring to watch the teamwork involved. It required a level of technical expertise and deliberate communication that I aspire to. In the end, we all agreed that the human-refactored repository was superior to Claude's output, but the exercise opened many of our eyes to the effectiveness of the tool we're working with.
The discussion eventually led us to a deeper question:
what has actually changed about software development, and what hasn’t?
In another discussion, someone compared AI to a smart car. In some newer Tesla models, the driver doesn’t even need to use a gear shifter; the car infers what the user intends based on sensor data. If a wall is behind the car when the driver presses the gas, it moves forward. If a child runs into the road, the car stops automatically.
But the car’s purpose remains the same: transportation.
If we think about AI like a smart car, a similar question appears: if the machine can do the task well, why should the human need to understand what it’s doing?
My answer ended up being surprisingly simple, and I’m honestly surprised it took me so long to arrive at it.
The one thing AI cannot change in our profession is responsibility and culpability.
Let's return to the smart car analogy. If a self-driving car malfunctions and harms a pedestrian, the driver is still responsible. If the sensors fail and the car backs through a garage wall, it's not the car that has to explain what happened; it's the driver.
The same applies to AI.
If an AI agent writes code that leaves a codebase vulnerable to security threats, it is the developer’s responsibility to catch it. That developer is ultimately accountable for the oversight. If we allow AI to write all of our code without understanding or reviewing it, then how do we take responsibility for the systems we build? How do we respond when things go wrong?
Lastly, many people at the conference were concerned about the ethical use of AI and how we might use it to make the world better. My answer remains the same: if we are moving toward an AI-heavy future, then we must use it to build tools that bring restorative justice to those who have been neglected by our current systems.
Agile and the tools of teamwork
I also learned a great deal about how technical teams function and how experienced professionals approach leadership and workflow.
Many of the developers I look up to shared the tools and practices that work best on their teams. Hearing about how different teams structure communication, decision-making, and collaboration was incredibly valuable.
I also had the opportunity to facilitate a conversation about unconventional applications of Agile. In that session, I heard from a group of thoughtful and compassionate people who described how they bring Agile concepts into their everyday lives.
Some people talked about using Kanban boards to organize and gamify household tasks within their families. Parents shared how they encourage blameless communication with their children through affirmations designed to build self-empowerment.
My favorite example was an affirmation a parent shared with their child:
“Being talked to is not being in trouble.”
That simple phrase stuck with me, and it’s one I plan to use in my own life going forward.
My career goals and values
By far, the most common conversations I had at AONW were about careers: how people align their work with their values and what helps them be effective in their roles.
As someone still figuring out where my own career is headed, it meant a lot to sit down with seasoned professionals and hear their experiences. One thing that reassured me is that many of them had faced the same uncertainty at some point in their careers.
One conversation in particular stuck with me. I had the chance to share lunch with someone I deeply respect, and she left me with a statement that I haven't stopped thinking about since:
“I don’t do what I do for money. I do it for autonomy.”
Hearing that from someone I look up to changed something in my brain. It helped me realize that what I value most in this field isn't just solving technical problems; it's the freedom to explore them.
Now I suppose I have a reading list ahead of me, because I intend to track down every book she recommended.
Lastly, I had several wonderful conversations about communication, leadership, and the human side of technical work. If you haven’t read it already, I highly recommend the article Empathy is a Technical Skill by Andrea Goulet. It captures so much of what I’ve been exploring in my own journey toward empathy and leadership, and it puts those ideas into words far better than I could.
Andrea, thank you again for taking the time to sit down and chat with me.
Closing Thoughts
Thank you again to everyone who made AONW 2026 such a fantastic conference. I genuinely had so much fun, and I’m leaving newly inspired, refreshed, and excited to try new things.
If you’re considering attending Agile Open Northwest 2027 in Seattle, I strongly encourage you to give it a shot.