Bottom-Up Software Effort Estimation Technique
Last time, we explained that even though estimating software project timelines is tough, we should do it anyway, and we broadly discussed some of the common estimation techniques. With that background, we want to go into further detail about the bottom-up software effort estimation technique that works for us. A critical aspect any estimation technique should capture is both time and uncertainty: an estimate that states only a time implies a high degree of certainty, and that is generally not true for most projects.

Here it goes:

- Break down work into less complex task chunks
- Estimate uncertainty
- Revisit the software effort estimation
- Track

For details, read on.

Break Down Work into Less Complex Task Chunks

It is tough to give an accurate estimate for a complex task. Therefore, break it into small chunks; that reduces the number of uncertain variables. We use the following sizes:

| Complexity | Time |
| --- | --- |
| Small | 1 day |
| Medium | 3 days |
| Large | 1 week |
| Extra-Large | 2 weeks |

Your estimates become accurate only when you are very granular in mapping and recording the hours that go toward a project's completion. The whole point is to use real wall-clock hours and days, not idealized "programmer hours." Do not be overly optimistic here: if something will likely take 3 days but might get done in 1 day because you got lucky, it should still be quoted as medium. On the other hand, do not be pessimistic and quote that same task as large to cover yourself. Ideally, your estimate will mostly comprise small and medium tasks, with few large tasks and perhaps no extra-large ones. You need not do this in one fell swoop; the bottom-up software effort estimation technique lets you refine the estimate later.

A good software effort estimation technique will also capture uncertainty. "20 to 30 days" is a very different estimate from "5 to 45 days", even though both have the same mid-point of 25 days. The goal is to capture the expected-case versus the worst-case scenario.
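The size buckets above amount to a small lookup table. Here is a minimal Python sketch of that idea; the names and the assumption that one week equals five working days are ours, purely for illustration:

```python
# Expected effort per complexity bucket, in real wall-clock working days.
# Assumption of this sketch: 1 week = 5 working days, 2 weeks = 10.
SIZE_DAYS = {
    "small": 1,
    "medium": 3,
    "large": 5,
    "extra-large": 10,
}

def expected_days(tasks):
    """Sum the expected effort for tasks given as (name, complexity) pairs."""
    return sum(SIZE_DAYS[complexity] for _name, complexity in tasks)

tasks = [
    ("Creating the filter", "medium"),
    ("Showing an error message", "small"),
    ("Applying the automation rule", "large"),
]
print(expected_days(tasks))  # 3 + 1 + 5 = 9 days
```

The point of the fixed buckets is that you never argue over "2.5 days vs. 3 days"; a task is simply small, medium, large, or extra-large.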
How to Estimate Uncertainty?

Once you have quoted the expected time blocks as described in the previous section, apply the "if-things-go-wrong" multiplier below:

| Uncertainty Level | Multiplier |
| --- | --- |
| Low | 1.1 |
| Moderate | 1.5 |
| High | 2.0 |
| Extreme | 5.0 |

You can certainly choose different multiplier values. What matters is defining a system and sticking to it; that is what lets you produce consistently accurate quotes. Ideally, most of your estimates will carry low or moderate uncertainty, with very few at high or extreme.

How do you arrive at the worst-case estimates? The groundwork you accomplished in the earlier steps makes this simple. Let me explain with an example:

| Task | Complexity | Uncertainty | Expected | Worst-Case |
| --- | --- | --- | --- | --- |
| Creating the filter | Medium | Moderate | 3 days | 4.5 days |
| Applying the automation rule | Large | High | 1 week | 2 weeks |
| Showing an error message | Small | Low | 1 day | 1.1 days |
| Integration with Salesforce | Extra-Large | Extreme | 2 weeks | 10 weeks |

Revisit the Software Effort Estimation

Is such a wide range acceptable? If not, this step comes in. The range is so wide because a few tasks combine extra-large complexity with extreme uncertainty. Review those together with your colleagues and brainstorm ways to reduce the uncertainty. In particular, break the extra-large tasks down into smaller ones. Granted, you would already have done that had it been easy; the trick is to revisit these tasks with a group of capable people on board, because it requires deeper research. For instance, you could assign two weeks to work hands-on on part of such a task chunk and understand its complexity more closely. That may let you split it into several small and medium chunks plus one remaining large one. There is no single correct strategy here. The key observation is that if you are dealing with too many uncertain components, you should take some time to break them down into smaller, easier blocks.

Track

Track your accuracy so that you can improve over time. This forms a feedback loop: you project, you observe how much reality deviates from the projection, and you use that knowledge to refine your next projection.
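Putting the pieces together, the whole flow (size buckets, per-task multipliers, and the resulting project range) can be sketched in a few lines of Python. The helper names and the five-working-days-per-week convention are our assumptions, not part of the original technique:

```python
# Illustrative sketch of the bottom-up flow: sum the expected days, then
# apply each task's "if-things-go-wrong" multiplier for the worst case.
SIZE_DAYS = {"small": 1, "medium": 3, "large": 5, "extra-large": 10}
UNCERTAINTY_MULTIPLIER = {"low": 1.1, "moderate": 1.5, "high": 2.0, "extreme": 5.0}

def estimate_range(tasks):
    """tasks: iterable of (name, complexity, uncertainty) triples.
    Returns (expected_days, worst_case_days) for the whole list."""
    expected = sum(SIZE_DAYS[c] for _n, c, _u in tasks)
    worst = sum(SIZE_DAYS[c] * UNCERTAINTY_MULTIPLIER[u] for _n, c, u in tasks)
    return expected, worst

tasks = [
    ("Creating the filter", "medium", "moderate"),
    ("Applying the automation rule", "large", "high"),
    ("Showing an error message", "small", "low"),
    ("Integration with Salesforce", "extra-large", "extreme"),
]
expected, worst = estimate_range(tasks)
print(f"{expected} to {worst:.1f} days")  # prints "19 to 65.6 days"
```

If the printed range feels too wide, that is exactly the signal to revisit the estimate and break the high-uncertainty tasks into smaller chunks, as described above.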
We hope you will find this software effort estimation technique useful. Please share your feedback if you use it on your next project. Credit for this technique goes to Jacob Kaplan-Moss, co-creator of Django.