SIGS DATACOM — Specialist Information for IT Professionals

SOFTWARE MEETS BUSINESS:
The Conference for Software Architectures
Munich, 30 January - 03 February 2017

Session details

Talk: Fr 6
Date: Fri, 03.02.2017
Time: 09:00 - 16:00

Agile Estimating, Benchmarking, and Release Planning


How do the productivity and quality you achieve compare across the span of your projects, be they Agile, Waterfall, or Outsourced? Learn about quality and defect metric trends and how defect patterns behave on real projects. Working in pairs, calculate productivity metrics using the templates Michael Mah employs in his consulting practice, then leverage these metrics to make the case for moving to more agile practices and for creating realistic project commitments within your organization.

Target Audience: CIOs, Directors, VPs, Software Engineering Managers, Organizational and Project Leadership
Prerequisites: A working knowledge of Agile
Level: Practicing


Extended Abstract
How do the productivity and quality you achieve compare across the span of your projects, be they Agile, Waterfall, or Outsourced? Join Michael Mah to learn about quality and defect metric trends and how defect patterns behave on real projects. Learn how to use your own data to create project trends for productivity, time-to-market, and defect rates, and see these metrics in action in retrospectives and release planning. In hands-on exercises, learn how to replicate these techniques in your company.
Working in pairs, calculate productivity metrics using the templates Michael employs in his consulting practice, then leverage these new metrics to make the case for moving to more agile practices and for creating realistic project commitments within your organization. Take back new ways to communicate the value of implementing agile development practices to key decision makers.
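As a flavor of the kind of calculation done in the pairing exercises, the sketch below computes two metrics the abstract mentions: productivity (delivered size per person-month) and defect density (defects per unit of size). These formulas are common industry conventions and the project data is hypothetical; they are not Michael Mah's actual consulting templates.

```python
# Hedged sketch: common productivity and defect metrics, NOT the
# specific templates used in Michael Mah's consulting practice.

def productivity(size_units: float, effort_person_months: float) -> float:
    """Delivered size (e.g. story points or function points) per person-month."""
    return size_units / effort_person_months

def defect_density(defects_found: int, size_units: float) -> float:
    """Defects found per unit of delivered size."""
    return defects_found / size_units

# Hypothetical data for two projects of equal size.
projects = {
    "agile":     {"size": 400, "effort_pm": 20, "defects": 32, "months": 6},
    "waterfall": {"size": 400, "effort_pm": 35, "defects": 88, "months": 11},
}

for name, p in projects.items():
    print(f"{name:>9}: "
          f"productivity={productivity(p['size'], p['effort_pm']):.1f} units/PM, "
          f"defect density={defect_density(p['defects'], p['size']):.2f} defects/unit, "
          f"time-to-market={p['months']} months")
```

Comparing such numbers side by side across agile, waterfall, and outsourced projects is what lets a team argue for process changes with data rather than opinion.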
This tutorial has been presented in various forms to highly rated reviews and has been updated to reflect new case studies; it pairs well with the speaker's other submission. At its core is the methodology used to collect Agile benchmark data from companies that were past OOP attendees; the results were presented in a past OOP keynote.