@@ -109,35 +109,42 @@ Here are practical examples of questions you can ask to get insights from your D
 
 *Track improvement or degradation in a team's delivery speed over time.*
 
-#### Cross-metric analysis
-
-**Query:** "Which services have both high deployment frequency and low change failure rate?"
+### Advanced querying techniques
 
-*Identify well-performing services that maintain high velocity with good quality.*
+<Tabs>
+<TabItem value="filters" label="Filters">
 
-**Query:** "Show me teams with MTTR above 4 hours in the last month"
+**Time ranges and filtering:**
+- "Show me DORA metrics for services tagged as 'critical' in the last 30 days"
+- "What's the deployment frequency for microservices owned by the API team since January?"
+- "Compare MTTR between production and staging environments"
 
-*Identify teams that might need help with incident response processes.*
+</TabItem>
+<TabItem value="trends" label="Trends">
 
-### Advanced querying techniques
+**Trend analysis over time:**
+- "How has our overall change failure rate changed over the last 6 months?"
+- "Show me the deployment frequency trend for the E-commerce team quarter over quarter"
+- "Has the Platform team's lead time improved since implementing the new CI/CD pipeline?"
 
-#### Filtering and time ranges
+</TabItem>
+<TabItem value="benchmarks" label="Benchmarks">
 
-- **"Show me DORA metrics for services tagged as 'critical' in the last 30 days"**
-- **"What's the deployment frequency for microservices owned by the API team since January?"**
-- **"Compare MTTR between production and staging environments"**
+**Performance comparisons:**
+- "How do our DORA metrics compare to industry benchmarks?"
+- "Which teams are performing above/below the organization average for each DORA metric?"
+- "Show me services that meet the 'Elite' DORA performance criteria"
 
-#### Trend analysis
+</TabItem>
+<TabItem value="cross-metrics" label="Cross-Metrics">
 
-- **"How has our overall change failure rate changed over the last 6 months?"**
-- **"Show me the deployment frequency trend for the E-commerce team quarter over quarter"**
-- **"Has the Platform team's lead time improved since implementing the new CI/CD pipeline?"**
+**Multi-metric analysis:**
+- "Which services have both high deployment frequency and low change failure rate?"
+- "Show me teams with MTTR above 4 hours in the last month"
+- "Identify teams that might need help with incident response processes"
 
-#### Benchmarking
-
-- **"How do our DORA metrics compare to industry benchmarks?"**
-- **"Which teams are performing above/below the organization average for each DORA metric?"**
-- **"Show me services that meet the 'Elite' DORA performance criteria"**
+</TabItem>
+</Tabs>
 
 
 ## Understanding the responses
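
The new `<Tabs>`/`<TabItem>` markup renders only when the MDX page imports Docusaurus's theme components. That import normally sits at the top of the file, outside this hunk; a minimal sketch of what the page is assumed to declare (standard Docusaurus MDX, shown for context only, not as part of the commit):

```mdx
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
```
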
@@ -159,6 +166,8 @@ The MCP server responses rely on Port's data model, using the [available MCP too
 - Request to create a custom web application to visualize the data (Claude Artifacts is excellent for this)
 - Generate executive-ready dashboards and reports
 
+<img src="/img/guides/MCPDoraCustomVisualization.png" width="100%" border="1px" />
+
 :::tip Getting better results
 To get the most accurate and useful responses:
 - Be specific about time ranges (e.g., "last 30 days" instead of "recently")
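
The "custom web application" bullet above can be made concrete: asked for a deployment-frequency view, a tool like Claude Artifacts would typically return a small React component. The sketch below is illustrative only; the `TeamMetric` shape and sample values are hypothetical, not Port's actual MCP response format:

```tsx
import React from "react";

// Hypothetical shape -- the real data comes from Port's data model and may differ.
type TeamMetric = { team: string; deploysPerWeek: number };

const sample: TeamMetric[] = [
  { team: "Platform", deploysPerWeek: 14 },
  { team: "E-commerce", deploysPerWeek: 9 },
  { team: "API", deploysPerWeek: 6 },
];

// Minimal bar list: one row per team, bar width proportional to deployment frequency.
export function DeploymentFrequencyChart({ data = sample }: { data?: TeamMetric[] }) {
  const max = Math.max(...data.map((d) => d.deploysPerWeek));
  return (
    <div>
      {data.map((d) => (
        <div key={d.team} style={{ display: "flex", alignItems: "center", gap: 8 }}>
          <span style={{ width: 120 }}>{d.team}</span>
          <div style={{ height: 16, background: "#4c9aff", width: `${(d.deploysPerWeek / max) * 100}%` }} />
          <span>{d.deploysPerWeek}/wk</span>
        </div>
      ))}
    </div>
  );
}
```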