The newly announced Jellyfish Benchmarks feature enables engineering leaders to add context to engineering metrics and performance by introducing a method for peer comparison.

According to the company, the complexity of modern software engineering often leaves engineering leaders without proper context for their metrics. With Benchmarks, Jellyfish customers can now see, at the percentile level, how they stack up against their peers.

Engineers who opt in will have their data anonymized and added to the Jellyfish customer benchmarking pool. Key benchmark categories include allocation, delivery, productivity, and collaboration.

Allocation benchmarks enable teams to understand how their resource investments compare to those of other engineering organizations. Delivery can be benchmarked using metrics such as cycle time, productivity using metrics such as issues resolved, and collaboration using metrics such as PR reviews.
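
To make the percentile-level comparison concrete, the sketch below shows one simple way a metric value could be ranked against an anonymized pool of peer values. It is a minimal, hypothetical Python example under an assumed rank-based definition of percentile; the `percentile_rank` function and the sample cycle-time values are illustrative assumptions, not Jellyfish's implementation.

```python
from bisect import bisect_right

def percentile_rank(value: float, peer_values: list[float]) -> float:
    """Percentile of `value` within an anonymized pool of peer values:
    the share of peers at or below that value, expressed 0-100.
    (Hypothetical helper for illustration only.)"""
    pool = sorted(peer_values)
    if not pool:
        return 0.0
    return 100.0 * bisect_right(pool, value) / len(pool)

# Example: a team's median cycle time of 3.2 days against a sample peer pool
peers_cycle_time_days = [1.8, 2.4, 3.0, 3.6, 4.1, 5.0, 6.3]
print(f"Percentile rank: {percentile_rank(3.2, peers_cycle_time_days):.0f}")
```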

New benchmarking capabilities also enable DevOps teams to track against DORA metrics, including deployment frequency, lead time, mean time to recovery, and change failure rate.
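
For readers less familiar with these measures, the sketch below illustrates how the four DORA metrics are typically derived from raw deployment records. It is a minimal, hypothetical Python example, not a representation of how Jellyfish calculates or benchmarks them; the `Deployment` record and `dora_summary` function are assumptions made for illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Deployment:
    deployed_at: datetime
    commit_created_at: datetime   # when the underlying change was committed
    caused_failure: bool          # whether the deployment led to an incident
    restored_at: datetime | None  # when service was restored, if it failed

def dora_summary(deployments: list[Deployment], window_days: int = 30) -> dict:
    """Compute simple DORA-style metrics from a list of deployment records
    (illustrative only; not Jellyfish's methodology)."""
    n = len(deployments)
    failures = [d for d in deployments if d.caused_failure]

    # Deployment frequency: deployments per day over the window
    frequency = n / window_days

    # Lead time: average time from commit to deployment
    lead_times = [d.deployed_at - d.commit_created_at for d in deployments]
    avg_lead_time = sum(lead_times, timedelta()) / n if n else timedelta(0)

    # Mean time to recovery: average time from failed deploy to restoration
    recoveries = [d.restored_at - d.deployed_at for d in failures if d.restored_at]
    mttr = sum(recoveries, timedelta()) / len(recoveries) if recoveries else timedelta(0)

    # Change failure rate: share of deployments that caused a failure
    failure_rate = len(failures) / n if n else 0.0

    return {
        "deployment_frequency_per_day": frequency,
        "avg_lead_time": avg_lead_time,
        "mean_time_to_recovery": mttr,
        "change_failure_rate": failure_rate,
    }
```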

“By offering the industry’s only in-app benchmarking, Jellyfish is empowering engineering leaders to use context-driven data to examine the efficacy of their engineering strategy and their team’s operations,” said Krishna Kannan, head of product at Jellyfish. “Jellyfish Benchmarks is available for every metric tracked across teams or the entire organization, helping customers understand how their company compares for the metrics that matter most to their engineering function.”

Jellyfish Benchmarks can also provide data-driven insights that inform non-engineering executives and help drive more strategic decisions.