Amid all the panic about how artificial intelligence is rapidly replacing human work, we are hiding a dirty little secret. Humans are awful at breaking down goals into component parts. Anyone who has tried to use any type of project management software intuits this. Articulating clear, specific and manageable tasks is very hard.
Humans are inference driven, always integrating little bits of context and nuance. If your boss gives you a goal, the best path forward on it will depend on hundreds if not thousands of factors.
What’s your budget? What is your timeline? Who is on your team? Do you dislike someone? Want to impress another person? Is the goal to be fast? Is the goal to be good? Is the goal to be as good as possible, as fast as possible, with as little budget as possible? Trick question: quit your job if that last one is true.
Knowing what we want, knowing the best path to achieving it, and knowing how the pursuit of those goals affects our family, friends, neighbors, enemies and adversaries creates layers of decisions in a complex matrix of possibilities. This would be easy for a machine to navigate if we gave it the right inputs, outputs, and parameters. But alignment isn’t all that easy.
I personally don’t believe most of us know what we want. Beyond a Girardian imitation of whatever our current culture deems valuable, and thus worthy of admiration, genuine desire is hard to pin down. And that makes it hard to have goals. Specific tasks require goals. Specific preferences about how a task gets achieved narrow things down even further.
If I could distill every detail, desire, nuance, and preference, and align them all with my wider goals, ambitions, and the critical paths to achieving them, you know I’d have it all organized on some kanban board. And if an AI could extract that from me and turn all my goals into discrete assignments and task chunks, I’d happily go full Culture and let it run my entire life.