Overview
The Contributors view is your team’s individual performance lens inside Optimal AI Insights.
It turns raw GitHub activity into a clear, contextualized report about how each engineer is contributing to your team’s velocity, collaboration, and delivery quality.
Each contributor’s page tells a complete story: how often they code, how fast their PRs move through review, how they collaborate across teams, and how their efficiency trends over time.
It’s designed for engineering managers, tech leads, and team leads who want to understand not just “how much” someone is working but how their workflow affects the team’s rhythm and throughput.
In practice, this page functions as an auto-generated weekly report for each contributor, powered by live GitHub data and Optimal AI’s AI Insight engine.
What You’ll See
1. Contributor Summary
At the top, you’ll see an at-a-glance view of the engineer: their name, role, and location, along with a date range selector.
Changing the range recalculates every metric, so you can zoom into a sprint or expand to a quarter for trend analysis.

The dropdown makes it easy to switch between individuals, enabling quick comparisons or 1:1 preparation between multiple engineers on the same team.

2. Highlights
The Highlights panel summarizes what changed during the selected period.
It tracks short-term improvements and areas that might need attention, for example:
- Faster turnaround on PRs (reduced cycle time)
- Increase or dip in coding days
- Efficiency changes over time
- Review speed or collaboration shifts
Each highlight is automatically contextualized: it compares the current week or month to the previous period and frames the result in terms of impact.
This gives managers a high-signal summary that can be used directly in weekly check-ins or retrospectives.
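The period-over-period framing behind each highlight can be sketched roughly as follows. This is an illustrative sketch only, not Optimal AI’s actual implementation; the function name, metric labels, and output wording are assumptions.

```python
# Illustrative sketch of period-over-period highlight framing.
# Not Optimal AI's actual logic; names and thresholds are assumptions.

def highlight(metric: str, current: float, previous: float,
              lower_is_better: bool = False) -> str:
    """Frame a metric change by comparing the current period to the previous one."""
    if previous == 0:
        return f"{metric}: no prior-period baseline"
    change = (current - previous) / previous * 100
    # For metrics like cycle time, a decrease is an improvement.
    improved = (change < 0) if lower_is_better else (change > 0)
    direction = "improved" if improved else "regressed"
    return f"{metric} {direction} {abs(change):.0f}% vs. previous period"

# Example: average PR cycle time dropped from 30h to 24h.
print(highlight("Avg PR cycle time", 24.0, 30.0, lower_is_better=True))
# -> "Avg PR cycle time improved 20% vs. previous period"
```

The same comparison works for counts such as coding days by leaving `lower_is_better` off.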

3. Efficiency Score
The Efficiency Score is Optimal AI’s composite productivity indicator.
It reflects how balanced an engineer’s activity is across the core developer workflow: coding, opening PRs, and reviewing others’ work.
Rather than focusing on volume alone, it looks at the mix of contribution types to highlight healthy, well-rounded patterns.
For example, an engineer who both pushes code and consistently reviews others’ PRs will naturally have a stronger efficiency balance than someone focused on one dimension only.
This score is normalized within each team, so contributors can see how their current patterns compare to the broader group trend.
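One way such a composite, team-normalized score could work is sketched below. Optimal AI’s real formula is not described here, so the balance weighting and z-score normalization are illustrative assumptions only.

```python
# Illustrative only: Optimal AI's actual Efficiency Score formula is not
# public. This sketch rewards a balanced mix of commits, opened PRs, and
# reviews, then normalizes the result against the team distribution.

from statistics import mean, pstdev

def balance_score(commits: int, prs_opened: int, reviews: int) -> float:
    """Higher when activity is spread across all three dimensions."""
    parts = [commits, prs_opened, reviews]
    total = sum(parts)
    if total == 0:
        return 0.0
    shares = [p / total for p in parts]
    # An even 1/3-1/3-1/3 split gives evenness 1.0; a single-dimension
    # focus gives 0.0 (the maximum total deviation from 1/3 is 4/3).
    evenness = 1 - sum(abs(s - 1 / 3) for s in shares) / (4 / 3)
    return total * evenness

def team_normalized(raw: float, team_raw: list[float]) -> float:
    """Z-score the raw value against the team's distribution."""
    sd = pstdev(team_raw)
    return 0.0 if sd == 0 else (raw - mean(team_raw)) / sd
```

Under this sketch, an engineer with 10 commits, 10 opened PRs, and 10 reviews scores higher than one with 30 commits and nothing else, matching the “well-rounded patterns” idea above.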

4. Team Performance
The Team Performance panel benchmarks each contributor against their team or functional group.
It shows ranking or percentile indicators (e.g., “Top Engineer in Engineering Core based on Cycle Time”), giving quick visibility into top performers or areas needing support.
This benchmark helps leaders identify coaching opportunities and distribute review or merge load more fairly by showing where each person’s work cadence sits relative to peers with similar roles or repos.

5. AI Insight Integration
Every contributor page includes a ⚡ AI Insight section that can automatically generate an analysis powered by Optimal AI’s Insight engine.
It goes beyond metrics to deliver:
- TL;DR summaries that explain what changed and why.
- Actionable insights such as “Encourage diverse collaboration” or “Monitor review load.”
- Trend explanations connecting activity data (like smaller PRs or faster review times) to likely workflow improvements.
- Summary analyses that quantify shifts in coding days, PR size, and review timing.
Instead of manually analyzing trends, you get AI-curated takeaways that help you lead more effective retros, reviews, and 1:1s.
The AI Insight layer turns the Contributor page into a live performance narrative blending quantitative data with qualitative interpretation.

Metrics & Trends
Every metric on this page comes directly from GitHub activity, normalized by the date range you select.
They’re designed to give you visibility into not only what’s happening, but how it’s evolving over time.
Core Metrics
- Average PR Cycle Time — how long it takes from opening to merging a pull request. Useful for spotting bottlenecks or gains in review efficiency.
- Coding Days — number of unique days with commits in the selected period. Indicates coding consistency and engagement across weeks.
- Total Lines Added / Deleted — a code-churn measure showing how much work has been written, refactored, or cleaned up.
- Time to First Review — average time before a contributor’s PR gets its first review. A shorter time usually signals good review responsiveness.
- PRs Reviewed / PRs Opened / PRs Assigned — counts of code review engagement and ownership spread.
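To make two of these definitions concrete, here is a minimal sketch of computing them from GitHub-style activity data. The dict and timestamp shapes are simplified assumptions, not the real GitHub API schema.

```python
# Sketch of two core metrics from simplified GitHub-style data.
# Field names ("opened_at", "merged_at") are assumptions, not the
# actual GitHub API payload schema.

from datetime import datetime

def avg_pr_cycle_time_hours(prs: list[dict]) -> float:
    """Mean hours from PR open to merge, over merged PRs only."""
    durations = [
        (pr["merged_at"] - pr["opened_at"]).total_seconds() / 3600
        for pr in prs
        if pr.get("merged_at")  # skip PRs still open
    ]
    return sum(durations) / len(durations) if durations else 0.0

def coding_days(commit_timestamps: list[datetime]) -> int:
    """Number of unique calendar days with at least one commit."""
    return len({ts.date() for ts in commit_timestamps})

prs = [
    {"opened_at": datetime(2024, 5, 1, 9), "merged_at": datetime(2024, 5, 2, 9)},
    {"opened_at": datetime(2024, 5, 3, 9), "merged_at": None},  # still open
]
print(avg_pr_cycle_time_hours(prs))  # 24.0
commits = [datetime(2024, 5, 1, 10), datetime(2024, 5, 1, 15), datetime(2024, 5, 2, 8)]
print(coding_days(commits))  # 2 (three commits, two unique days)
```

Note that multiple commits on the same day count as a single coding day, which is why the metric reads as consistency rather than volume.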

Trend Visualization
Below the metrics, a time-series chart visualizes changes across days or weeks.
You can switch the metric being visualized among average PR size, PR cycle time, commits, and time-to-first-review to see where patterns emerge.
This view helps identify:
- Days with high merge or commit activity
- Times when review load spikes
- Patterns of improvement after process changes or team reorganizations

Activity Timeline
The Activity chart visualizes the contributor’s daily work in bubble form, with each event (commit, PR, review, merge, or comment) represented by a color-coded circle.
This gives you a literal picture of when and how consistently a contributor is active throughout the week or month.
It’s especially useful for spotting review backlogs or uneven work distribution during a sprint.
Activity types are color-coded to distinguish between coding, reviewing, and commenting actions, giving you both temporal and contextual clarity.

Commit Contribution & Review Collaboration
These two sections surface deeper patterns in how engineers interact across the organization.
- Commit Contribution maps monthly commit volume, highlighting consistency or variability over time. It’s a quick way to see who’s actively coding during certain release cycles.
- Review Collaboration lists who reviews each contributor’s PRs most often. This reveals team dependencies, mentorship relationships, and potential review bottlenecks. A healthy collaboration map shows diverse review activity across several teammates, not just one or two reviewers.
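The reviewer-frequency ranking behind a collaboration map like this can be sketched in a few lines. The `(author, reviewer)` pair input is an assumed shape for illustration, not Optimal AI’s data model.

```python
# Sketch of a review-collaboration map: who reviews whose PRs most often.
# The (author, reviewer) pair list is an assumed input shape.

from collections import Counter

def review_collaboration(reviews: list[tuple[str, str]],
                         author: str) -> list[tuple[str, int]]:
    """Most frequent reviewers of `author`'s PRs, in descending order."""
    counts = Counter(reviewer for a, reviewer in reviews if a == author)
    return counts.most_common()

reviews = [
    ("alice", "bob"), ("alice", "bob"), ("alice", "carol"),
    ("dave", "alice"),  # alice reviewing, not being reviewed
]
print(review_collaboration(reviews, "alice"))  # [('bob', 2), ('carol', 1)]
```

A result dominated by a single reviewer is the concentration pattern the section above flags as a potential bottleneck.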

Together, these charts transform raw GitHub relationships into actionable team insight.
Why It Matters
Engineering performance isn’t about counting commits — it’s about context.
The Contributors page provides that context in real time. It connects quantitative metrics (like PR cycle time) to qualitative patterns (like collaboration and review balance).
For managers, it means better 1:1s, clearer performance discussions, and faster recognition of improvement.
For engineers, it means visibility into their own workflow — what’s improving, where bottlenecks are forming, and how their work stacks up within the team.
Optimal AI’s Contributors view bridges the gap between individual productivity and team-level velocity, using AI-generated insights to keep both aligned.
ProTips
- Use this page in weekly 1:1s or retros to frame discussions around measurable progress.
- Pair it with PR Cycle Time to understand how individual performance affects overall velocity.
- Watch for review concentration — if one contributor reviews most PRs, consider load-balancing.
- Encourage contributors to reflect on their AI Insight summaries as part of self-assessment.
- Combine this view with Allocations or Distributions to connect effort with engineering investment types (e.g., feature vs. tech debt).
Troubleshooting
- No data shown: Confirm GitHub sync is active and the contributor has commits or PRs within the selected time range.
- Efficiency score missing: Ensure the contributor has both code and review activity; the score needs both inputs.
- Team comparison unavailable: Check if the contributor is assigned to a defined team in GitHub or Optimal AI.
- Activity chart flat: Expand to a larger time window or confirm event filters (commit/review/comment) are enabled.
In Summary
The Contributors page is your team’s heartbeat at the individual level.
It combines precision metrics, AI interpretation, and historical context so you can see not just what engineers did, but how their actions shape team outcomes.