BI Pixie Dashboard

BI Pixie Dashboard is a Power BI app that visualizes your engagement data. After you set up the dashboard, you can use it to analyze how your BI audience interacts with your Power BI reports. The sections below describe each page in the dashboard and the key visuals it contains. For metric definitions, see BI Pixie Metrics.

Accessing the Dashboard

You can open BI Pixie Dashboard in several ways:

  • From the BI Pixie portal sidebar, click the Open Dashboard button (appears after dashboard discovery).
  • From the Overview page, use the links in the BI Pixie Dashboard card.
  • Directly in Power BI Service from the workspace where you installed the app.

Global Filters

The dashboard includes global filters that persist across pages. Use these to narrow your analysis:

  • Date range: Filter by time period. Defaults to the last year. Available on every page.
  • Capacity: Filter by Power BI capacity (SKU and capacity name). Not yet supported in BI Pixie Cloud.
  • Workspace: Filter by workspace when tracking reports across multiple workspaces.
  • Report: Focus on specific reports.
  • Page: Drill into individual report pages.
  • Platform: Filter by access platform (web, desktop, mobile, embedded).
  • User: Drill into individual user behavior.

Overview

The landing page of the dashboard. It provides a high-level summary of your Power BI portfolio with key metrics, trends, and a central table for quick comparison across workspaces and reports.

Key visuals:

  • Report metrics cards: Instrumented Reports, Opened Reports, Interactive Reports, Passive Reports, Unused Reports, % Passive and % Unused.
  • Engagement Funnel: Shows how user activity narrows at each stage of engagement: Report Views → Page Views → Visual Interactions → Slicer Clicks → Bookmark Clicks → Tooltip Views → Drillthroughs → Link Clicks.
  • Interactive & Passive Sessions over time: Tracks session trends with cumulative user count overlay.
  • Engaged & Passive Users: Column chart comparing active vs. passive user counts.
  • CSAT (Customer Satisfaction) gauge: Satisfaction score from feedback controls (last response per user).
  • Feedback Clicks: Breakdown of positive, neutral, and negative feedback.
  • Key Survey Results: Financial gains and time savings from embedded surveys.
  • Metrics by Workspace and Report: A pivot table with Users, DAU (Daily Active Users), MAU (Monthly Active Users), Passive Users, CSAT, % Unused Pages, % Passive Sessions, Avg Interactions, and Avg Duration per report.
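
The funnel stages lend themselves to a simple drop-off calculation: each stage can be expressed as a percentage of the top stage and of the stage before it. The sketch below is illustrative only; the stage counts are invented sample data, not real BI Pixie output, and the dashboard computes these percentages for you.

```python
# Illustrative only: stage names mirror the Engagement Funnel;
# the counts are made-up sample data.
funnel = [
    ("Report Views", 12000),
    ("Page Views", 9500),
    ("Visual Interactions", 4200),
    ("Slicer Clicks", 2100),
    ("Bookmark Clicks", 800),
    ("Tooltip Views", 600),
    ("Drillthroughs", 250),
    ("Link Clicks", 90),
]

def funnel_rates(stages):
    """Return (stage, count, % of top stage, % of previous stage)."""
    top = stages[0][1]
    rows, prev = [], stages[0][1]
    for name, count in stages:
        rows.append((name, count, 100 * count / top, 100 * count / prev))
        prev = count
    return rows

for name, count, of_top, of_prev in funnel_rates(funnel):
    print(f"{name:20} {count:6d}  {of_top:5.1f}% of top  {of_prev:5.1f}% of prev")
```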

User Adoption

Tracks how many users are actively using your reports over time. Distinguishes between engaged users (who interact with visuals, slicers, etc.) and passive users (who only view pages).

Key visuals:

  • Metric cards: Total Users, DAU, MAU, DAU/MAU ratio, Engaged Users, Passive Users.
  • User Adoption Metrics over time: Line chart showing DAU, DAU excluding Passive, MAU, and MAU excluding Passive across months.
  • User Stickiness (DAU/MAU): Line chart tracking the DAU/MAU ratio — a standard SaaS engagement metric. Higher ratio means users come back more frequently.
  • MAU by Time Intelligence: A pivot table showing MAU (excluding passive users) broken down by time calculation (Current, MTD, MOM%, YOY%, etc.).
  • Users by Report: Bar chart ranking reports by user count.
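
Stickiness as shown in the DAU/MAU chart is conventionally computed as average daily active users divided by the number of users active at least once in the window. A minimal sketch, assuming a hypothetical per-day activity log (the `activity` data below is invented for illustration):

```python
from datetime import date, timedelta

# Hypothetical daily activity log: {date: set of active user IDs}.
activity = {
    date(2024, 5, 1) + timedelta(days=i): {"alice", "bob"} if i % 2 == 0 else {"alice"}
    for i in range(30)
}

def stickiness(activity_by_day):
    """DAU/MAU over the supplied window: average daily active users
    divided by the count of users active at least once in the window."""
    dau = sum(len(users) for users in activity_by_day.values()) / len(activity_by_day)
    mau = len(set().union(*activity_by_day.values()))
    return dau / mau

print(f"Stickiness: {stickiness(activity):.0%}")  # 1.5 avg daily / 2 monthly = 75%
```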

User Attrition

Identifies users who have stopped engaging with your reports. Helps you proactively reach out to disengaged users. The attrition calculation uses a configurable inactivity window (default: ignore last 2 months).

Key visuals:

  • Metric cards: Total Users, DAU, MAU, DAU/MAU, Engaged Users, Passive Users, % Passive Users.
  • User Attrition Metrics over time: Combined chart showing Users, New Users, Last-Time Users (columns) with User Attrition % (line). Last-time users are those whose final activity falls in that month.
  • Inactive Users table: Lists each user with their days of inactivity since last session.
  • Users by Report: Bar chart showing user counts per report with CSAT tooltip.
  • Ignore last N months: A slicer to exclude recent months from the attrition calculation, since those users may still return.
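
The interplay between last-time users and the "Ignore last N months" slicer can be sketched as follows. The `last_activity` data and the month arithmetic are illustrative assumptions; the dashboard's own attrition measure may differ in detail.

```python
from datetime import date

# Hypothetical per-user last-activity dates, derived elsewhere from the event log.
last_activity = {
    "alice": date(2024, 1, 15),
    "bob": date(2024, 3, 2),
    "carol": date(2024, 6, 20),
    "dave": date(2024, 6, 28),
}

def last_time_users(last_activity, as_of, ignore_months=2):
    """Users whose final activity is older than the ignore window.
    Mirrors the 'Ignore last N months' slicer: users active within the
    window may still return, so they are not counted as churned."""
    cutoff_month = (as_of.year * 12 + as_of.month - 1) - ignore_months
    return {
        user for user, last in last_activity.items()
        if (last.year * 12 + last.month - 1) < cutoff_month
    }

churned = last_time_users(last_activity, as_of=date(2024, 7, 1))
print(sorted(churned))  # ['alice', 'bob'] — carol and dave fall in the ignore window
```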

User Engagement Analysis

Detailed analysis of how users interact with your reports. Provides distribution histograms, engagement tables, and sub-pages for session-level and low-usage analysis.

Key visuals:

  • Report metrics cards: Instrumented, Opened, Interactive, Passive, Unused Reports with percentages.
  • Engaged & Passive Users over time: Column chart tracking engaged vs. passive user trends.
  • Distribution of Avg Interactions by User: Histogram showing how users are distributed by their average interaction count per session.
  • Distribution of Session Duration by User: Histogram showing how users are distributed by their average session length.
  • Metrics by Report: Table with Users, MAU, MAU (Exc. Passive), % Unused Pages, % Passive Users, Avg Interactions, Total Interactions, and Avg Duration per report.
  • Metrics by Page: Same metrics at the page level, including % Interactive Sessions and Avg Duration in Page.

This page has two tabbed sub-pages (additional pages grouped under the main page's tab in the dashboard — use the tab bar at the top to switch between them):

  • Session Engagement Analysis: Dives deeper into session-level metrics. Shows distribution of interactions and duration per session (not per user), a sessions table by report and by page, and tracks Interactive vs. Passive Sessions over time.
  • Low-Usage Analysis: Identifies underperforming assets — Unused Reports, Unused Pages in Active Reports, Least-Active Reports and Pages (below 100 sessions), Least-Active Users, and Inactive Users with days of inactivity.

User Satisfaction

Analyzes feedback collected through the feedback controls (thumbs up/down, smile/frown). Shows satisfaction trends by report, page, and user. Requires feedback collection to be enabled (Pro plan).

Key visuals:

  • CSAT gauge: Overall customer satisfaction score. The CSAT selector lets you switch between "Last Response" (one vote per user-report) and "All Responses" (every click counted).
  • Feedback Clicks by icon: Column chart breaking down clicks by feedback type (positive, neutral, negative).
  • Feedback Clicks by Button: Shows which feedback button style was used (thumbs, smile, etc.).
  • CSAT & Feedback Clicks over time: Combined chart tracking feedback volume and CSAT trend.
  • CSAT by Report: Horizontal bar chart ranking reports by satisfaction score.
  • CSAT by User: Horizontal bar chart showing per-user satisfaction.
  • CSAT & MAU by Report: Scatter plot correlating satisfaction with monthly active users.
  • CSAT & Interactions by Report: Scatter plot correlating satisfaction with average interactions per session.
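
The "Last Response" vs. "All Responses" distinction in the CSAT selector can be illustrated with a small calculation. This sketch assumes CSAT is the share of positive responses, with the feedback log below invented for illustration; the dashboard's exact formula may differ.

```python
# Hypothetical feedback log: (user, report, timestamp, rating),
# where rating is "positive", "neutral", or "negative".
feedback = [
    ("alice", "Sales", 1, "negative"),
    ("alice", "Sales", 5, "positive"),   # later vote supersedes in Last Response mode
    ("bob",   "Sales", 2, "positive"),
    ("carol", "Sales", 3, "neutral"),
]

def csat(feedback, mode="last"):
    """Share of positive responses. mode='last' keeps one vote per
    user-report pair (the most recent); mode='all' counts every click."""
    if mode == "last":
        latest = {}
        for user, report, ts, rating in feedback:
            key = (user, report)
            if key not in latest or ts > latest[key][0]:
                latest[key] = (ts, rating)
        ratings = [r for _, r in latest.values()]
    else:
        ratings = [r for _, _, _, r in feedback]
    return sum(r == "positive" for r in ratings) / len(ratings)

print(f"CSAT (last response): {csat(feedback, 'last'):.0%}")  # 2 of 3 → 67%
print(f"CSAT (all responses): {csat(feedback, 'all'):.0%}")   # 2 of 4 → 50%
```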

Survey Results

Displays NPS (Net Promoter Score), business value estimates, and time-saved responses from embedded surveys (Pro plan). A "Survey Calc Type" slicer lets you choose between Latest Response (one answer per user-report) and All Responses (every submission counted).

Key visuals:

  • Metric cards: Respondents (N), NPS, Promoters, Detractors, Time Saving (Hours), Total Gains, and Avg Gains.
  • NPS gauge: NPS score on a -100 to +100 scale.
  • NPS over time: Combined chart with Promoters and Detractors (columns) and NPS (line).
  • Responses by Rating: Column chart showing the distribution of NPS ratings (0-10).
  • Time Savings & Financial Gains over time: Column charts tracking the cumulative business impact.
  • Respondents by Financial Gain / Time Saving: Bar charts showing how many respondents selected each answer value.
  • Survey Results by Report: Table with N (respondents), NPS, Detractors, Promoters, Time Savings, Financial Gains, and Gains per Respondent.
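
The NPS gauge follows the standard definition: the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6), yielding a score from -100 to +100 with passives (7-8) diluting both sides. A minimal sketch with invented sample ratings:

```python
def nps(ratings):
    """Net Promoter Score on the standard -100..+100 scale:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) dilute."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical survey responses (one 0-10 rating each).
sample = [10, 9, 9, 8, 7, 6, 3]
print(f"NPS: {nps(sample):.1f}")  # 3 promoters, 2 detractors of 7 → 14.3
```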

This page has two tabbed sub-pages:

  • Responses Over Time: Tracks survey responses over time by question, with a question selector slicer. Includes a line chart showing answer distribution, column chart of response counts by answer, and a table of results by report. Also includes an answer slicer to filter to specific responses.
  • Answers by Respondent: Lists individual respondent data (report, date, user, answers, financial gains, time savings, NPS rating) and shows bar charts of respondent distribution by Issues, Time Saving, Decision Impact, and Financial Gain categories.

Heatmap

Visual-level engagement analysis showing which visuals receive the most clicks. Displays a heatmap overlay on your report page layout. See Use Heatmap for setup instructions.

Key visuals:

  • Heatmap grid: A matrix visual that overlays click density on your report page layout. Each cell is color-coded by click count. Hover over a cell to see the visual type, title, and click count in a tooltip.
  • Clicks by Page: Table ranking pages by click count, with direct links to open the page in Power BI.
  • Clicks by Visual: Table listing each visual with its ID, type, built-in status, title, and click count.
  • Clicks by Column and Selected Value: Shows what users selected within each visual (slicer values, filter selections, bookmarks, links).
  • Clicks by Event Type: Bar chart breaking down clicks by interaction type (VizClick, Slicer, Bookmark, Link).
  • Clicks by Visual Type: Bar chart showing which visual types receive the most clicks.

Custom Visuals Heatmap

Extends the heatmap view with a focus on custom visual security and trust. Shows which custom visuals are used in your reports, their certification status (Certified, Not Certified, Unlisted, Unknown), and maps them on the page layout with trust-level color coding.

Key visuals:

  • Custom Visuals Heatmap grid: A matrix overlaying visual trust levels on your page layout. Colors indicate certification status.
  • Visuals by State: Table listing visual states (Certified, Not Certified, Unlisted, Unknown) with trust levels and visual counts.
  • Count of Custom Visuals by Type: Table showing visual type, certification state, publisher, visual count, and click count.
  • Pages with Custom Visuals: Table listing pages with counts of Certified, Uncertified, and Risky visuals.
  • Clicks by Visual: Detail table listing each visual with its type, trust state, ID, and title.

Data Auditing

Audits data access patterns by tracking what data values users select through slicers, filters, and visual interactions. Helps identify potential data governance concerns and understand which data dimensions and values are most frequently accessed.

Key visuals:

  • Data Selections Over Time: Line chart tracking the volume of data selection events.
  • Decomposition Tree: Interactive drill-down from Filtered Table → Filtered Column → Filtered Values, showing how many times each data path was accessed.
  • Word Cloud: Visual summary of the most frequently selected data values.
  • Data Selections by User: Tables showing which users accessed which data values, on which reports and pages.
  • Slicers: Filter by Filtered Table, Filtered Column, Filtered Values, and Distinct Count to narrow the audit.

RLS Auditing

Tracks Row-Level Security (RLS) context for users interacting with your reports. Helps detect RLS misconfigurations, overexposed data, and potential security risks. Requires RLS auditing to be enabled in Settings.

Key visuals:

  • Metric cards: Users with RLS, Users with No Single Role, Over-blocked Users, Users with Suspicious Changes, Suspected Overexposed Rows, and RLS Changes.
  • RLS Alert Metrics by Report: Table showing per-report breakdown of users with single roles, unmatched/multiple roles, over-blocked users, suspicious changes, and overexposed rows.
  • RLS Alert Metrics by User: Same metrics at the user level.
  • RLS Alert Metrics by Date: Combined chart tracking Views with No Single Role, Over-Blocking Views, and Suspected Overexposed Rows over time.
  • Overexposed Data: Table showing RLS role, secured table, suspected overexposed rows, expected row count, and actual loaded rows.
  • Maximal Distinct Count of Filtered Values: Table showing the maximum number of distinct values each user sees through each RLS role and column.
  • Distinct Count by Table: 100% stacked bar chart showing how filtered values are distributed across RLS roles and tables.
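
The over-blocking and overexposure alerts boil down to comparing the row count a role is expected to expose against the rows a session actually loaded. A rough sketch with hypothetical audit records; the classification rules here are illustrative, and BI Pixie's own detection logic may be more nuanced.

```python
# Hypothetical RLS audit records: expected row count per role/table
# (from the security model) vs. rows actually loaded in a session.
audits = [
    {"role": "EU Sales", "table": "Orders",    "expected_rows": 1200, "loaded_rows": 1200},
    {"role": "EU Sales", "table": "Customers", "expected_rows": 300,  "loaded_rows": 4800},
    {"role": "US Sales", "table": "Orders",    "expected_rows": 900,  "loaded_rows": 0},
]

def classify(audit):
    """Flag sessions that load more rows than the role should expose
    (suspected overexposure) or none at all (over-blocking)."""
    if audit["loaded_rows"] > audit["expected_rows"]:
        return "overexposed"
    if audit["loaded_rows"] == 0 and audit["expected_rows"] > 0:
        return "over-blocked"
    return "ok"

for a in audits:
    print(a["role"], a["table"], "->", classify(a))
```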

This page has one tabbed sub-page:

  • RLS Suspicious Changes: Focuses on detecting suspicious changes in RLS-filtered data. Shows a decomposition tree for drilling into changes by Report → RLS Role → User → Secured Table → Column → Filtered Values, plus bar charts of RLS Changes by User and by Report, a timeline of changes, and tables showing actual rows loaded and accessible data by filter value.

Design Impact

Analyzes how report design characteristics affect user engagement. Uses three calculated scores per report — Complexity Score (based on visual count and layout density), Usability Score (based on interactive controls like bookmarks, slicers, tooltips, and drill-throughs), and Passivity Score (based on the ratio of table visuals and columns to total visuals) — and correlates them with business outcome metrics (MAU, Interactive Sessions, CSAT, NPS, Financial Gains, Time Savings).

Key visuals:

  • Scatter plots: Plot reports by a design attribute (X axis) against a business outcome (Y axis), with bubble size indicating user count. Use the two slicers to choose which design attribute and which business outcome metric to compare.
  • Design Metrics by Report: Table listing each report with Users, MAU, Complexity/Usability/Passivity scores, Pages, Visuals per Page, Tables/Matrices, Columns in Tables, Bookmarks, Slicers, Tooltip Pages, Drillthrough Pages, and non-analytic objects (buttons/shapes/images).

This page has three tabbed sub-pages:

  • Design Impact by Page: Same scatter plots and design metrics but at the page level instead of report level.
  • Design Changes: Tracks how design metrics change over time. Shows a line chart correlating a design metric (e.g., Visuals per Page) with a business outcome (e.g., MAU) over time. Includes a detailed table of design metrics by page.
  • Design Impact Help: A text page explaining how to interpret the design scores.

Position of Pages

Analyzes how the position (index) of a page within a report affects engagement. Helps you decide which pages to place first and whether reordering could improve usage.

Key visuals:

  • Active & Passive Users by Page Position: Column chart showing engaged vs. passive users at each page index.
  • Avg Interactions & Duration per Session by Position: Combined chart showing average interactions (columns) and average time spent (line) at each page position.
  • Activity Metrics by Position: 100% stacked column chart breaking down all activity types (Report Views, Page Views, Visual Interactions, Slicer Clicks, Bookmark Clicks, Tooltip Views, Drillthroughs, Link Clicks) by page position.
  • Pages by Interactions & Duration: Scatter plot placing each page by its average page index (X) and average interactions (Y), with bubble size indicating time spent.
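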

Activity by Report

Detailed activity logs showing individual event types broken down by different dimensions. A dynamic field-parameter slicer lets you choose which activity metrics to display.

Key visuals:

  • Metric cards: Page Views, Visual Interactions, Slicer Clicks, Bookmark Clicks, Drillthroughs, Link Clicks.
  • Page Activities Over Time: Combined chart showing each activity type as a separate series.
  • Tracked Activities by Report: 100% stacked bar chart showing the activity mix per report.
  • Sessions by Report: Table with Sessions, Interactions per Session, % Interactive Sessions, Passive Sessions, and CSAT per report.

This page has three tabbed sub-pages:

  • Activity by User: Same activity breakdown but grouped by user. Includes a users table with Sessions, Interactions per Session, % Interactive Sessions, and CSAT.
  • Activity by Page: Same activity breakdown but grouped by page. Includes a page sessions table with Avg Interactions, % Interactive Sessions, and Passive Sessions.
  • Single User Activity: Deep dive into a single user's event timeline. Shows a word cloud of their most-accessed data values, an event log table (timestamp, report, page, event type, platform, visual details, bookmarks, links, filters, IP), and tracked activities over time.

Metrics by Report

A comprehensive data export page with all metrics combined into a single table. Designed for detailed comparison and export. Includes field parameter slicers for User Adoption Metrics, Report Session Metrics, Activity Metrics, and a Time Intelligence slicer (Current, MTD, MOM%, YOY%, running averages, etc.).
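
The MOM% and YOY% time-intelligence options compare each month's value with the value one month or twelve months earlier. A minimal sketch with a hypothetical monthly MAU series:

```python
# Hypothetical monthly MAU series keyed by (year, month), used to
# illustrate the MOM% and YOY% time-intelligence calculations.
mau = {
    (2023, 6): 400, (2024, 5): 480, (2024, 6): 520,
}

def pct_change(series, year, month, months_back):
    """Percent change vs. the value `months_back` months earlier.
    Returns None when there is no (or a zero) baseline."""
    idx = year * 12 + month - 1 - months_back
    prev = series.get((idx // 12, idx % 12 + 1))
    if not prev:
        return None
    return 100 * (series[(year, month)] - prev) / prev

print(f"MOM%: {pct_change(mau, 2024, 6, 1):.1f}%")   # vs May 2024 → 8.3%
print(f"YOY%: {pct_change(mau, 2024, 6, 12):.1f}%")  # vs June 2023 → 30.0%
```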

Key visuals:

  • Master table: A comprehensive table with columns for Project, Capacity, Workspace, Report, CSAT (Last Response and All Responses), NPS, all User Adoption Metrics (Users, Engaged, Passive, DAU, MAU, WAU and their ratios), all Report Session Metrics (Sessions, Interactive/Passive, Duration, Interactions, Total Hours), and all Activity Metrics (Page Views, Visual Interactions, Slicer Clicks, Bookmarks, Tooltips, Drillthroughs, Link Clicks).

This page has two tabbed sub-pages:

  • Metrics by User: The same comprehensive metric table, but with an additional User column, showing all metrics per user per report. Includes separate tables for report-level and page-level engagement, with page session metric slicers.
  • All Metrics: A reference page showing all available metrics organized by category (Report Metrics, Report Sessions, Adoption Metrics including/excluding Passive, Page Metrics, Page Sessions, Tracked Activities, User Satisfaction, Survey Analysis, Design Scores) with a Time Intelligence slicer.

Events

A raw event log showing individual telemetry records. Use this page for debugging, auditing, or understanding specific user interactions at the lowest level of detail.

Key visuals:

  • Recent events table: Shows each event with timestamp, project, workspace, report, page, event type, page index, username, platform, visual name, visual type, bookmark, feedback, link, filtered column, filtered values, and client IP.
  • Event Type slicer: Filter events by type (VizClick, Slicer, Bookmark, Link, Drillthru, etc.).

Related Resources

  • Settings — Configure tracking options, feedback controls, and surveys that feed into the dashboard.
  • Set up Dashboard — Install and connect the dashboard to your data.
  • BI Pixie Semantic Model — Understand the underlying data model for building custom reports.

What's Next