
Sprint Review Meetings

At the end of every sprint my team has a sprint review. It's become a tradition, and I think everyone agrees we get a lot out of it. People share best practices, flag things to watch out for, and generally check in with each other. To have a great review meeting you need to get people to open up. How do you create a comfort zone where people can speak up? It's the sticky issues that generally slow people down, annoy them, and sometimes cause people to quit their jobs.

Here are a few tricks I've accumulated over my twelve years of management:

First, ask people to write down things they consider "accelerators" and "blockers" on sticky notes, at least two of each. It takes a little effort on everyone's part, but it's better to collect a bunch of minor items than to miss one major issue. Somehow writing something on a sticky note, without adding your name to it, makes people more comfortable than speaking up in front of everyone. Once everyone has their stickies, I ask them to post the notes in two areas of the whiteboard, one for accelerators and one for blockers. This generally gets the group going with ideas for improvement and shared lessons.

The second tactic I've employed is asking people what they learned in the last iteration. I've found that for technical folks it's easier to say "I learned that the build is really unstable" than "you know, our build system is total crap" in front of the people who might actually be maintaining that system.

The last method, which I rarely use but which could work well for a distributed team, is SurveyMonkey. Any anonymous system that creates a safe environment is a great way to get your team to talk.

How about your team? How do you get opinions from everyone? Get in touch with me via Twitter @mikebz, post a comment, or find me in other ways.
