60 Practice Questions & Answers
You are implementing a data lakehouse in Microsoft Fabric. Which component serves as the primary storage layer for both structured and unstructured data?
A. Power BI Dataset
B. Datamart
C. Warehouse
D. OneLake ✓ Correct
Explanation: OneLake is Microsoft Fabric's unified data lake that provides centralized storage for all organizational data in both structured and unstructured formats, eliminating data silos.
When designing a medallion architecture in a Fabric lakehouse, what is the primary purpose of the Bronze layer?
A. To store raw, ingested data in its original format with minimal transformation ✓ Correct
B. To serve as the source of truth for all analytics queries
C. To aggregate data for executive dashboards and reporting
D. To contain cleaned and deduplicated data ready for analytics
Explanation: The Bronze layer in the medallion architecture acts as a landing zone for raw data ingestion, preserving the original data format to enable data lineage and recovery if needed.
You need to create a real-time analytics solution in Microsoft Fabric. Which two components must you configure together to enable continuous data ingestion and processing?
A. Lakehouse and Paginated Reports
B. Dataflow Gen2 and Power BI Report
C. Warehouse and Azure Data Factory
D. Eventstreams and KQL Database ✓ Correct
Explanation: Eventstreams in Fabric capture real-time event data and can stream it directly to KQL (Kusto Query Language) Databases for immediate analytics and time-series analysis.
Which T-SQL feature allows you to query Delta tables in a Microsoft Fabric Warehouse without converting them to Parquet format?
A. Cross-lake query federation
B. Delta format compatibility layer
C. Warehouse external table syntax
D. Shared metadata across OneLake ✓ Correct
Explanation: Fabric's shared metadata across OneLake enables the Warehouse to query Delta tables natively from lakehouses through unified SQL endpoint access, without format conversion.
You are configuring a Dataflow Gen2 that pulls data from multiple cloud sources. Which transformation feature allows you to combine data from different sources into a single table?
A. Join operation
B. Merge queries
C. Append queries ✓ Correct
D. Union transformation
Explanation: Append queries in Dataflow Gen2 combine multiple tables vertically, stacking rows from different sources with matching column structures into a single output table.
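The vertical-stacking semantics of an append can be sketched in plain Python (an illustration of the behavior only, not the Dataflow Gen2 engine; the sample tables are made up):

```python
# Minimal sketch of an "append" (vertical union) over row dictionaries,
# assuming every source shares the same column structure.
def append_queries(*tables):
    """Stack rows from multiple tables into one output table."""
    combined = []
    for table in tables:
        combined.extend(table)
    return combined

source_a = [{"id": 1, "region": "EU"}, {"id": 2, "region": "US"}]
source_b = [{"id": 3, "region": "APAC"}]

result = append_queries(source_a, source_b)
# result holds all three rows stacked vertically
```

A merge, by contrast, would join the sources horizontally on a key column rather than stacking rows.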
When implementing row-level security (RLS) in Microsoft Fabric, at which layer should you primarily apply dynamic filters for a warehouse-based analytics solution?
A. In the Dataflow Gen2 transformation logic
B. In the Power BI semantic model using DAX formulas ✓ Correct
C. In the Warehouse table definition using security policies
D. In OneLake access control lists
Explanation: While Warehouse supports RLS, Power BI's semantic model is the recommended layer for RLS implementation because it provides granular dynamic filtering based on user context and is centrally managed.
You need to establish a direct connection from Microsoft Fabric to an on-premises SQL Server database. Which gateway type must be deployed in your network?
A. Cloud gateway with hybrid connector
B. On-premises data gateway in standard mode ✓ Correct
C. Personal gateway configured for enterprise use
D. Standard gateway only
Explanation: An on-premises data gateway installed in standard mode (not personal mode) is required to securely connect cloud-based Fabric to on-premises data sources such as SQL Server.
In a Microsoft Fabric implementation, what is the primary advantage of using shortcuts in a lakehouse instead of copying data?
A. Shortcuts provide automatic data encryption at rest and in transit
B. Shortcuts automatically optimize query performance by 50 percent
C. Shortcuts enable direct modification of external data without affecting the source
D. Shortcuts eliminate data duplication and maintain a single source of truth while providing unified access ✓ Correct
Explanation: Shortcuts create logical references to data stored elsewhere (within OneLake or in external sources) without duplicating storage, enabling unified access and reducing storage costs while maintaining data lineage.
You are designing the refresh schedule for a Dataflow Gen2 that depends on external API data. What is the recommended approach to handle potential API throttling or timeout issues?
A. Configure the dataflow to fail silently and skip failed rows
B. Implement exponential backoff retry logic and schedule refreshes during off-peak hours ✓ Correct
C. Increase the refresh frequency to catch all data updates immediately
D. Bypass error handling to prioritize throughput over reliability
Explanation: Implementing exponential backoff and scheduling refreshes during off-peak hours reduces API throttling, improves reliability, and ensures consistent data delivery without overwhelming external systems.
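The backoff pattern itself looks like this in Python (a minimal sketch; the throttled_api stub, retry count, and delays are assumptions for illustration, not a Fabric API):

```python
import time

def fetch_with_backoff(fetch, max_retries=4, base_delay=0.1):
    """Retry a flaky call, doubling the wait each attempt (0.1s, 0.2s, 0.4s...)."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except ConnectionError:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the failure
            time.sleep(base_delay * (2 ** attempt))

# Simulated API that is throttled twice before succeeding.
calls = {"count": 0}
def throttled_api():
    calls["count"] += 1
    if calls["count"] < 3:
        raise ConnectionError("429 Too Many Requests")
    return {"rows": 42}

result = fetch_with_backoff(throttled_api)
# succeeds on the third attempt after two escalating waits
```

Doubling the delay between attempts gives a throttled endpoint progressively more room to recover instead of hammering it at a fixed interval.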
When creating a semantic model in Microsoft Fabric, which table role should be used to store historical snapshots of dimension attributes?
A. Slowly Changing Dimension Type 0
B. Slowly Changing Dimension Type 2 ✓ Correct
C. Bridge table with temporal validity
D. Fact table with snapshot grain
Explanation: Slowly Changing Dimension Type 2 preserves complete history by creating new rows with effective date ranges when attributes change, allowing accurate historical analysis.
You need to monitor the performance of a lakehouse in Microsoft Fabric. Which built-in tool provides real-time insights into table statistics and query execution metrics?
A. Lakehouse Explorer statistics pane
B. SQL Server Management Studio Extended Events
C. Fabric Capacity Metrics application ✓ Correct
D. Performance Analyzer in Power BI
Explanation: The Fabric Capacity Metrics application provides comprehensive real-time monitoring of workspace and item performance, including query metrics and capacity utilization across all Fabric items.
When implementing incremental refresh in a Dataflow Gen2, which parameter type should you use to automatically filter data based on the last refresh timestamp?
A. Manual parameter requiring user input before each refresh
B. Query-based parameter referencing an external timestamp table
C. Static parameter with hardcoded date value
D. Dynamic parameter using the RefreshTime system variable ✓ Correct
Explanation: The RefreshTime system variable in Dataflow Gen2 automatically captures the last successful refresh timestamp, enabling dynamic incremental-load filtering without manual parameter updates.
You are optimizing a slow-running query in a Fabric Warehouse. Which indexing strategy is most effective for queries with multiple WHERE clause predicates?
A. Filtered index covering only low-selectivity columns
B. Full-text index on text columns
C. Clustered index on the primary key only
D. Composite nonclustered index including all filtered columns ✓ Correct
Explanation: Composite nonclustered indexes including all filtered columns enable the query optimizer to use index seeks instead of table scans, significantly improving performance for multi-predicate WHERE clauses.
Which Microsoft Fabric component is best suited for building embedded analytics experiences within custom applications?
A. Eventstreams for real-time event capture
B. Power BI Premium embedded with service principal authentication ✓ Correct
C. KQL Database for dashboard visualization
D. Datamart for operational reporting only
Explanation: Power BI Premium embedded with service principal authentication allows developers to embed Power BI reports and dashboards within custom applications with granular access control.
When configuring a KQL Database in Microsoft Fabric, which ingestion method provides the lowest latency for streaming sensor data?
A. Batch import from CSV files
B. Scheduled Dataflow Gen2 refresh
C. Eventstream connection with dynamic ingestion mapping ✓ Correct
D. T-SQL BULK INSERT from Warehouse
Explanation: Eventstream connections to KQL Databases provide near real-time ingestion with sub-second latency and automatic schema mapping, ideal for continuous sensor data streaming.
You need to implement column-level encryption for sensitive customer data in a Microsoft Fabric Warehouse. Which T-SQL feature should you use?
A. Transparent Data Encryption (TDE)
B. Always Encrypted with column encryption keys ✓ Correct
C. Database-level encryption with service-managed keys
D. Row-level security policies with encrypted values
Explanation: Always Encrypted provides column-level encryption where data is encrypted on the client side before transmission, ensuring sensitive columns remain encrypted in the Warehouse and during queries.
In a Microsoft Fabric implementation, what is the primary role of a Mirrored Database?
A. To create backup copies of semantic models for disaster recovery
B. To store historical audit logs for compliance purposes
C. To cache frequently accessed Power BI datasets
D. To replicate data from external sources into Fabric with minimal latency while maintaining automatic synchronization ✓ Correct
Explanation: Mirrored Databases automatically replicate data from external sources such as Azure SQL Database, Azure Cosmos DB, or Snowflake into Fabric with continuous synchronization, eliminating manual ETL for these platforms.
You are designing a star schema in Microsoft Fabric for a retail analytics solution. How should you handle product dimension records that change infrequently but may have attribute updates?
A. Use a fact table with embedded product attributes for denormalization
B. Create a Type 2 slowly changing dimension with effective date ranges and a surrogate key in the fact table ✓ Correct
C. Maintain separate version tables and join them at query time with LEFT OUTER JOIN
D. Store only the latest product attributes in a single static dimension table
Explanation: Type 2 slowly changing dimensions with effective dates and surrogate keys enable accurate historical analysis while maintaining referential integrity and a clean fact table design.
When sharing a Fabric lakehouse across multiple teams, which workspace permission level allows team members to read data but prevents them from modifying table schemas?
A. Viewer at the workspace level ✓ Correct
B. Contributor with limited scope
C. Admin with restricted editing rights
D. Member with schema lock enabled
Explanation: The Viewer permission at the workspace level grants read-only access to published items, including lakehouses, preventing any modifications to data, schemas, or item definitions.
You need to validate data quality in a Dataflow Gen2 before loading it into a lakehouse. Which transformation feature allows you to define custom validation rules and flag problematic rows?
A. Pivot transformation with validation aggregate
B. Error handling with row rejection criteria ✓ Correct
C. Data profiling with automatic anomaly detection
D. Conditional column with custom expressions
Explanation: Error handling in Dataflow Gen2 allows you to define conditions that identify invalid rows, redirect them to separate error tables, and continue processing valid data without interruption.
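The route-and-continue pattern can be sketched in Python (a conceptual illustration only; the rule names and sample rows are invented for the example, not a Dataflow Gen2 API):

```python
# Sketch of row-rejection validation: invalid rows go to an "error table"
# with the names of the rules they failed, while valid rows flow on.
def validate_rows(rows, rules):
    valid, errors = [], []
    for row in rows:
        failed = [name for name, check in rules.items() if not check(row)]
        if failed:
            errors.append({**row, "failed_rules": failed})
        else:
            valid.append(row)
    return valid, errors

# Illustrative rules: a plausible email and a positive amount.
rules = {
    "has_email": lambda r: "@" in (r.get("email") or ""),
    "positive_amount": lambda r: r.get("amount", 0) > 0,
}
rows = [
    {"email": "a@contoso.com", "amount": 10},
    {"email": "bad-address", "amount": -5},
]
valid, errors = validate_rows(rows, rules)
# the first row passes; the second lands in the error table with both rule names
```

Keeping the failed rule names on each rejected row is what makes the error table actionable for later remediation.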
In Microsoft Fabric, which feature enables you to define business logic and calculations once and reuse them across multiple Power BI reports and Warehouses?
A. Python scripts in Dataflow Gen2
B. DAX queries in Power BI Desktop
C. T-SQL stored procedures with parameterization
D. Calculation groups in semantic models ✓ Correct
Explanation: Calculation groups in semantic models define standardized measures and time-intelligence calculations once, making them reusable across all Power BI reports connected to that semantic model.
You are implementing a data retention policy for a Fabric lakehouse that stores transaction history. What is the recommended approach to archive old data while maintaining query performance?
A. Implement table partitioning by date and archive older partitions to external cold storage using shortcuts ✓ Correct
B. Keep all data in the primary lakehouse and use WHERE clauses to filter historical queries
C. Delete rows older than the retention period during daily maintenance windows
D. Move data to a separate archive lakehouse and remove it from the main lakehouse entirely
Explanation: Table partitioning by date with archival to cold storage reduces active table size for better performance while preserving historical data accessibility through shortcuts, without data duplication.
When configuring a Dataflow Gen2 that ingests data from a REST API with pagination, which approach ensures you capture all available records?
A. Use a Web.Contents function with pagination parameters to fetch all pages in a single loop ✓ Correct
B. Configure the API source with automatic pagination detection enabled
C. Set a fixed row limit and execute multiple dataflows sequentially
D. Implement a custom Python script within the dataflow to handle pagination logic
Explanation: Using Web.Contents with pagination parameters (typically using a loop in Power Query) allows you to fetch all API pages in a single dataflow execution without manual intervention.
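The page-by-page loop that such a Power Query implementation performs looks roughly like this in Python, with a stubbed fetch function standing in for the real HTTP call (the page shape and has_more flag are assumptions for the sketch, not any specific API's contract):

```python
# Generic pagination loop: keep requesting pages until the API reports
# there is no next page, accumulating all records along the way.
def fetch_all(fetch_page):
    records, page = [], 1
    while True:
        batch = fetch_page(page)
        records.extend(batch["items"])
        if not batch["has_more"]:
            break
        page += 1
    return records

# Stubbed three-page API for the illustration.
PAGES = {1: [1, 2], 2: [3, 4], 3: [5]}
def fake_api(page):
    return {"items": PAGES[page], "has_more": page < len(PAGES)}

all_records = fetch_all(fake_api)
# all five records are captured across the three pages
```

The key point mirrors the correct answer above: a single execution drives the loop to exhaustion, so no records are dropped between scheduled runs.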
In a Microsoft Fabric implementation, which monitoring alert should you configure to proactively identify when a Warehouse capacity is approaching throttling limits?
A. OneLake storage reaches maximum allocated size
B. Workspace utilization exceeds 80 percent of total capacity ✓ Correct
C. Query execution time exceeds 5 minutes average
D. Refresh failures occur for consecutive scheduled runs
Explanation: Configuring alerts when workspace utilization exceeds 80 percent of total capacity enables proactive capacity management and prevents throttling, which degrades query performance across all Fabric items.
You need to implement federated identity for a Microsoft Fabric solution serving international users. Which authentication method provides the best balance of security and user experience across regions?
A. OAuth 2.0 with hardcoded regional service principals
B. Azure AD with multi-region tenant configuration and conditional access policies ✓ Correct
C. SAML 2.0 with separate identity providers per region
D. Basic authentication with regional password policies
Explanation: Azure AD (now Microsoft Entra ID) with a multi-region tenant configuration provides centralized identity management, while conditional access policies enforce security across all regions without requiring separate authentication systems.
You are implementing an analytics solution in Microsoft Fabric. You need to ingest data from multiple sources including Azure SQL Database, Dynamics 365, and CSV files. Which Fabric component should you primarily use to orchestrate these data ingestion tasks?
A. SQL Analytics Endpoint
B. Data Factory pipelines ✓ Correct
C. Dataflow Gen2
D. Power BI Desktop
Explanation: Data Factory pipelines in Fabric are designed to orchestrate and automate data ingestion from multiple heterogeneous sources, including databases, SaaS applications, and file storage.
Which of the following accurately describes the relationship between a Lakehouse and a Warehouse in Microsoft Fabric?
A. Both Lakehouse and Warehouse store data in Delta format, but Lakehouse supports schema-on-read while Warehouse enforces schema-on-write ✓ Correct
B. Warehouse is a legacy component being deprecated in favor of Lakehouse
C. Lakehouse and Warehouse are identical components with different names in different regions
D. Lakehouse is designed for unstructured data only, while Warehouse is for structured data only
Explanation: Lakehouse and Warehouse are complementary components in Fabric; the Lakehouse provides flexibility with schema-on-read for diverse data, while the Warehouse enforces structured schemas for relational analytics.
You have created a Dataflow Gen2 that transforms customer data. You need to refresh this dataflow automatically every morning at 2 AM. What should you configure?
A. Scheduled refresh in the Dataflow settings ✓ Correct
B. A Data Factory pipeline trigger
C. Power BI refresh schedule in the dataset settings
D. A time-based automation rule in the workspace
Explanation: Dataflow Gen2 supports built-in scheduled refresh settings that allow you to configure automatic refresh times without requiring external orchestration.
Your organization needs to implement row-level security (RLS) on a semantic model used by multiple departments. Each department should only see their own sales data. Where should you implement this RLS?
A. In the Data Factory pipeline using conditional transformations
B. In the SQL Analytics Endpoint using T-SQL row filters
C. In the Lakehouse using Delta Lake table properties
D. In the Power BI semantic model using DAX roles and filters ✓ Correct
Explanation: Row-level security in Fabric semantic models is implemented using DAX roles with filter expressions that dynamically restrict data based on user context.
You are designing a Fabric workspace architecture for a large enterprise with multiple teams. What is the recommended approach for organizing workspaces?
A. Create separate workspaces per department or business unit, implementing shared semantic models for common dimensions ✓ Correct
B. Create workspaces randomly and redistribute content quarterly
C. Create one workspace per individual user to maximize isolation and security
D. Create a single workspace for the entire organization to ensure data consistency
Explanation: Best practice is to organize workspaces by business function or department while using shared semantic models to promote reusability and consistency across teams.
Which notebook environment in Fabric is primarily used for data science and machine learning workloads?
A. T-SQL Notebook for querying only
B. Excel Notebook for collaborative analysis
C. Synapse Notebooks with PySpark and R support ✓ Correct
D. Power Query Notebook for data transformation only
Explanation: Synapse Notebooks in Fabric support PySpark, Spark SQL, and R, making them ideal for data science, machine learning, and exploratory data analysis.
You need to optimize a Lakehouse query that is performing slowly on aggregated data. Which Fabric feature should you implement to improve query performance without changing the underlying data structure?
A. Archive old data to a separate storage account
B. Create materialized views or aggregation tables ✓ Correct
C. Convert the Lakehouse to a Warehouse
D. Increase the cluster size of the Spark pool
Explanation: Creating materialized views or aggregation tables in a Lakehouse allows you to pre-compute expensive aggregations and significantly improve query performance.
What is the primary advantage of using the SQL Analytics Endpoint in a Lakehouse?
A. It eliminates the need for a separate Warehouse
B. It provides a read-only SQL interface to query Lakehouse data without requiring Spark ✓ Correct
C. It automatically encrypts all data at rest
D. It provides write access to Lakehouse tables using standard SQL INSERT statements
Explanation: The SQL Analytics Endpoint provides a read-only relational SQL interface over Lakehouse Delta tables, allowing traditional SQL tools and users to query data without Spark expertise.
You are configuring monitoring and alerting for your Fabric analytics solution. Which tool should you use to monitor the health and performance of your Fabric workspaces and items?
A. Fabric Capacity Metrics ✓ Correct
B. Power BI Embedded analytics
C. Azure Monitor only
D. Excel spreadsheet with manual tracking
Explanation: Fabric Capacity Metrics provides built-in monitoring dashboards and performance metrics for workspaces, items, and resource utilization within your Fabric capacity.
When using Power Query in Fabric, which transformation approach is most efficient for handling large datasets?
A. Manually script transformations in Python for maximum control
B. Perform all transformations in Power BI after importing data into Fabric
C. Apply all transformations in Power Query Desktop before loading to Fabric
D. Use native Query Folding to push transformations to the source database when possible ✓ Correct
Explanation: Query folding in Power Query translates transformations into source-native queries (such as SQL), significantly improving performance by filtering data at the source rather than loading and transforming it in memory.
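The contrast between a folded and an unfolded filter can be sketched in Python (conceptual only; the generated SQL text and table names are illustrative, not what Power Query's folding engine actually emits):

```python
# Folded: the filter is translated into the source query, so only
# matching rows ever leave the database.
def folded_query(table, year):
    return f"SELECT * FROM {table} WHERE OrderYear = {year}"

# Unfolded: every row is loaded first, then filtered in local memory,
# which is the expensive path folding avoids.
def unfolded_filter(rows, year):
    return [r for r in rows if r["OrderYear"] == year]

sql = folded_query("Sales", 2024)
local = unfolded_filter(
    [{"OrderYear": 2023}, {"OrderYear": 2024}], 2024
)
```

Both paths return the same rows; the difference is where the work (and the data movement) happens, which is why folding dominates on large datasets.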
You are implementing a data quality framework in Fabric. Which feature should you use to validate data quality rules and track data quality metrics over time?
A. Manual SQL scripts in the SQL Analytics Endpoint
B. Data Quality monitoring in semantic models with automated alerts
C. Power BI report validation only
D. Dataflow Gen2 with custom validation logic in Power Query ✓ Correct
Explanation: Dataflow Gen2 allows you to implement data quality validations using Power Query transformations and can include quality checks and monitoring logic within your data pipeline.
In a Fabric Lakehouse, what does the 'Tables' section represent compared to the 'Files' section?
A. Tables are managed Delta tables with metadata and schema; Files are unmanaged data in the /Files folder ✓ Correct
B. Tables are read-only, Files are read-write
C. Tables and Files are identical; the separation is only for organizational purposes
D. Tables are compressed CSV files, Files are uncompressed Parquet files
Explanation: In a Lakehouse, the Tables section contains managed Delta tables with full metadata and schema information, while the Files section stores unstructured or raw data in the /Files folder.
You need to create a semantic model that aggregates data from multiple tables in a Lakehouse for reporting purposes. What is the recommended approach?
A. Query directly from the Files section using ad-hoc SQL queries
B. Create separate Power BI datasets for each source table
C. Create a single semantic model with a star schema using materialized tables from the Lakehouse ✓ Correct
D. Import all raw data into Excel and create a pivot table
Explanation: Creating a semantic model with a star schema design promotes reusability, maintainability, and optimal query performance for reporting across multiple data sources.
Which authentication method is recommended for service principals accessing Fabric resources programmatically?
A. Use only basic username/password authentication
B. Store credentials in plain text in configuration files
C. Use shared passwords distributed to all team members
D. Use Azure AD service principal with certificate-based authentication ✓ Correct
Explanation: Certificate-based authentication for Azure AD service principals is the most secure approach for programmatic access, avoiding password management and providing audit trails.
You are designing a real-time analytics solution. Which Fabric component should you use to ingest and analyze streaming data from IoT sensors?
A. Excel Online for real-time collaboration
B. Scheduled Dataflow Gen2 only
C. Static Data Factory pipeline
D. Eventstream with integration to downstream analytics components ✓ Correct
Explanation: Eventstream in Fabric enables real-time data ingestion from multiple sources and can connect to downstream components like Lakehouse or Warehouse for continuous analytics.
When organizing your semantic model, which approach best follows Fabric and Power BI best practices?
A. Create a separate semantic model for each report
B. Implement a shared semantic model with reusable measures in a central location ✓ Correct
C. Store all logic in Power BI reports rather than semantic models
D. Create multiple independent semantic models with duplicate measures and dimensions
Explanation: Creating shared semantic models with reusable measures and dimensions promotes consistency, reduces maintenance overhead, and enables self-service analytics across the organization.
What is the primary purpose of Shortcuts in Fabric Lakehouse?
A. To reference data from other Lakehouses or OneLake locations without copying ✓ Correct
B. To create keyboard shortcuts for faster data entry
C. To compress data for faster download speeds
D. To create backup copies of tables
Explanation: Shortcuts in a Lakehouse allow you to create logical references to data stored in other locations (other Lakehouses, external cloud storage) without duplicating data, enabling data sharing and federation.
You need to implement column-level security on sensitive financial data in a Lakehouse. How should you approach this in Fabric?
A. Prevent all access to the Lakehouse
B. Delete sensitive columns from the Lakehouse tables
C. Use external data masking in Azure SQL Database only
D. Use semantic model object-level security (OLS) to hide columns from specific roles ✓ Correct
Explanation: Object-level security (OLS) in Power BI semantic models allows you to hide specific columns or tables from designated roles, providing column-level protection for sensitive data.
Which of the following best describes the relationship between OneLake and Lakehouse in Microsoft Fabric?
A. Lakehouse can only connect to external cloud storage, not OneLake
B. OneLake is a reporting tool, while Lakehouse is for data storage
C. OneLake and Lakehouse are completely separate components with no relationship
D. OneLake is Microsoft's cloud storage service, and Lakehouse is an abstraction layer on top of OneLake that provides database-like capabilities ✓ Correct
Explanation: OneLake is Fabric's underlying cloud storage service built on ADLS Gen2 architecture, while a Lakehouse provides a relational database abstraction over OneLake with tables, schema management, and metadata.
You are optimizing a Data Factory pipeline that loads data into a Lakehouse. The pipeline is currently running sequential activities. What optimization technique should you apply?
A. Increase the frequency of scheduled runs
B. Configure parallel activity execution where dependencies allow ✓ Correct
C. Move all activities to a single notebook
D. Disable monitoring to reduce overhead
Explanation: Configuring parallel execution of independent activities in a pipeline reduces overall execution time and improves pipeline efficiency by utilizing available compute resources.
What is the recommended approach for handling incremental data loads in a Fabric Lakehouse?
A. Implement Change Data Capture or use timestamps to load only new or modified records ✓ Correct
B. Manually identify changed records in Excel and update tables
C. Always perform full data replacement on each load
D. Use the Files section only and never update Tables
Explanation: Incremental loading using Change Data Capture or timestamp-based detection significantly reduces load times and data movement by processing only new or modified records.
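The timestamp-based variant is a watermark pattern, sketched here in Python (an illustration of the logic only; the column and variable names are made up, and real implementations would persist the watermark between runs):

```python
from datetime import datetime

# Watermark-based incremental load: process only rows modified after the
# last successful load, then advance the watermark for the next run.
def incremental_load(rows, last_watermark):
    new_rows = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max(
        (r["modified"] for r in new_rows), default=last_watermark
    )
    return new_rows, new_watermark

rows = [
    {"id": 1, "modified": datetime(2024, 1, 1)},
    {"id": 2, "modified": datetime(2024, 3, 1)},
]
loaded, watermark = incremental_load(rows, datetime(2024, 2, 1))
# only the row changed after the watermark is loaded, and the
# watermark advances to that row's modification time
```

Change Data Capture achieves the same "only what changed" outcome, but from the source's transaction log rather than a timestamp column.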
You need to ensure that your Fabric analytics solution maintains GDPR compliance. Which security feature should you implement to address data residency requirements?
A. Use Power BI Premium licensing exclusively
B. Configure capacity in the appropriate geographic region and use network security groups ✓ Correct
C. Enable encryption at rest only
D. Implement encryption in transit only
Explanation: GDPR compliance requires data residency in specific geographic regions, which is achieved by provisioning Fabric capacity in the appropriate region and potentially implementing network controls.
When creating measures in a semantic model for a Fabric workspace, which calculation approach provides the best performance for large datasets?
A. Use calculated columns for all calculations
B. Use explicit DAX measures with optimized aggregation functions ✓ Correct
C. Perform all calculations in Power Query and store results as columns
D. Use implicit measures that aggregate raw columns
Explanation: Explicit DAX measures in semantic models provide superior performance through engine optimizations and context awareness, making them preferable to calculated columns for aggregations in large datasets.
Your Fabric solution requires integration with external APIs to enrich your data. Which approach should you use?
A. Manually download data from APIs and upload CSV files
B. Use Databricks only for API integration
C. Implement API calls directly in Dataflow Gen2 using custom connectors or the Power Query Web.Contents function ✓ Correct
D. APIs cannot be integrated with Fabric
Explanation: Dataflow Gen2 supports custom connectors and Power Query functions like Web.Contents to call external APIs, enabling data enrichment and integration with third-party services.
You have created multiple reports and dashboards in a Fabric workspace and need to manage access carefully. What is the most effective approach for governing report distribution?
A. Use workspace roles and semantic model permissions to control access based on organizational structure ✓ Correct
B. Require users to request download of PDF exports only
C. Create duplicate reports for each user with embedded filters
D. Share all reports publicly to ensure maximum visibility
Explanation: Workspace roles (Admin, Member, Contributor, Viewer) combined with semantic model permissions provide granular governance, allowing you to control data and report access at scale.
You are creating a lakehouse in Microsoft Fabric and need to configure delta table properties. Which format does Microsoft Fabric use for storing table metadata and transaction logs?
A. Avro schema with checkpoint files
B. Apache Parquet with JSON manifests
C. Delta Lake format with transaction log ✓ Correct
D. ORC format with ACID properties
Explanation: Microsoft Fabric uses the Delta Lake format for lakehouse tables, which stores transaction logs in the _delta_log directory to maintain ACID compliance and data consistency.
You need to optimize a Power BI semantic model that is connected to a Fabric lakehouse. What is the primary benefit of using aggregation tables in this scenario?
A. Aggregation tables eliminate the need for refresh schedules
B. Aggregation tables cache pre-computed results to accelerate query performance ✓ Correct
C. Aggregation tables automatically enforce row-level security across all users
D. Aggregation tables reduce the size of the source lakehouse data files
Explanation: Aggregation tables in Power BI store pre-calculated summarized data, allowing queries to return results faster by querying the aggregated data instead of computing from raw data each time.
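The trade described above, computing once at refresh time so queries hit a small summary instead of scanning detail rows, can be sketched in Python (a conceptual illustration with made-up data, not Power BI's aggregation engine):

```python
# Detail rows stand in for the large lakehouse fact table.
detail = [
    {"region": "EU", "amount": 100},
    {"region": "EU", "amount": 50},
    {"region": "US", "amount": 75},
]

def build_aggregation_table(rows):
    """Pre-compute totals per region once, at refresh time."""
    summary = {}
    for r in rows:
        summary[r["region"]] = summary.get(r["region"], 0) + r["amount"]
    return summary

agg = build_aggregation_table(detail)  # computed once
eu_total = agg["EU"]                   # answered from the summary, no scan
```

Each subsequent query is a lookup in the summary rather than a pass over every detail row, which is exactly the latency win aggregation tables buy.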
Your organization wants to stream real-time data into a Fabric lakehouse using Event Hubs. Which component of Microsoft Fabric enables the continuous ingestion and transformation of streaming data?
A. Power Query Online with refresh intervals
B. Data Factory pipelines with scheduled triggers
C. Eventstream connected to a lakehouse ✓ Correct
D. Dataflow Gen2 with incremental load
Explanation: Eventstream in Microsoft Fabric is the native component designed to ingest streaming data from sources like Event Hubs and route it to lakehouse tables in real time or near real time.
You are designing a data warehouse in Fabric and need to implement slowly changing dimensions (SCD Type 2). Which table structure best supports this requirement?
A. A single fact table with versioned dimension keys and effective date ranges
B. A denormalized single table containing all dimension and fact attributes with timestamp columns
C. Multiple fact tables with separate dimension snapshots per date
D. A dimension table with surrogate keys, versioning columns, and effective/expiration dates to track historical changes ✓ Correct
Explanation: SCD Type 2 requires tracking historical changes by adding surrogate keys and date-based versioning columns (effective date, expiration date) to dimension tables, allowing queries to retrieve the correct version based on transaction date.
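The expire-and-insert mechanics can be sketched in Python (an illustration of the SCD Type 2 logic only; the column names, single tracked attribute, and in-memory list are assumptions for the example):

```python
from datetime import date

# SCD Type 2 sketch: an attribute change closes out the current row and
# inserts a new version with its own surrogate key and effective date.
def apply_scd2(dim_rows, business_key, new_attrs, today):
    current = next(
        r for r in dim_rows
        if r["product_id"] == business_key and r["expiration"] is None
    )
    if current["category"] == new_attrs["category"]:
        return dim_rows  # nothing changed: no new version needed
    current["expiration"] = today  # expire the old version
    dim_rows.append({
        "surrogate_key": max(r["surrogate_key"] for r in dim_rows) + 1,
        "product_id": business_key,
        "category": new_attrs["category"],
        "effective": today,
        "expiration": None,  # open-ended current version
    })
    return dim_rows

dim = [{
    "surrogate_key": 1, "product_id": "P1", "category": "Toys",
    "effective": date(2023, 1, 1), "expiration": None,
}]
dim = apply_scd2(dim, "P1", {"category": "Games"}, date(2024, 6, 1))
# two versions of P1 now exist, with contiguous date ranges
```

Facts recorded before the change keep joining to surrogate key 1, which is how historical reports stay accurate after the attribute update.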
You are configuring row-level security (RLS) in a Power BI semantic model connected to Fabric. A user should only see sales data for their assigned region. What approach should you implement?
A. Use dataset parameters to dynamically filter by user role
B. Implement object-level security on specific tables in the lakehouse
C. Use Power Query to filter data during refresh for each user
D. Create RLS roles with DAX formulas that filter based on user credentials ✓ Correct
Explanation: Row-level security in Power BI is implemented through RLS roles containing DAX filter expressions that evaluate at query time based on the authenticated user's identity and assigned roles.
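What such a role filter does at query time can be sketched conceptually in Python (this is only an analogy for the DAX filter's effect; the user-to-region mapping and function names are invented for the example):

```python
# Conceptual sketch of an RLS role filter: rows are restricted by an
# attribute tied to the authenticated user's identity.
USER_REGIONS = {"ana@contoso.com": "EU", "raj@contoso.com": "APAC"}

def apply_rls(sales_rows, user_principal_name):
    """Return only the rows the signed-in user is allowed to see."""
    region = USER_REGIONS[user_principal_name]
    return [r for r in sales_rows if r["region"] == region]

sales = [
    {"region": "EU", "amount": 100},
    {"region": "APAC", "amount": 200},
]
visible = apply_rls(sales, "ana@contoso.com")
# ana sees only the EU row; raj would see only the APAC row
```

In the real model the equivalent of USER_REGIONS is typically a security table related to the fact table, with the role's DAX filter comparing it against the signed-in user.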
Your Fabric workspace contains multiple dataflows that process customer data. You notice refresh failures during peak hours. What is the most effective approach to resolve this?
A. Increase the workspace capacity SKU to provide more compute resources for dataflow execution ✓ Correct
B. Schedule dataflows to refresh during off-peak hours only
C. Convert all dataflows to use DirectQuery mode instead of import mode
D. Reduce the number of transformations in each dataflow step
Explanation: Dataflow refresh performance is directly tied to workspace capacity; increasing the capacity SKU provides more computational resources to handle concurrent refreshes and complex transformations.
You need to implement a solution where a Data Analyst can modify a Power BI report but should not have the ability to edit the underlying semantic model. Which workspace role should you assign?
A. Contributor ✓ Correct
B. Editor
C. Report reader
D. Viewer
Explanation: The Contributor role allows users to create and edit reports and dataflows but not modify semantic models or manage workspace settings, making it ideal for analysts who need report authoring without model access.
You are implementing a data pipeline in Fabric Data Factory that requires conditional logic—different transformations should run based on the row count of input data. Which activity type enables this?
A. Lookup activity that counts rows and returns the value to trigger downstream activities
B. Switch activity that routes execution paths based on parameter values
C. If Condition activity with expressions to evaluate the input dataset row count ✓ Correct
D. Filter activity that partitions data and executes parallel branches
Explanation: The If Condition activity in Data Factory evaluates Boolean expressions dynamically at runtime, allowing you to branch pipeline execution based on metadata like row counts retrieved from previous activities.
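The branching decision reduces to a simple predicate over the looked-up row count, sketched here in Python (the branch names and threshold are illustrative, not pipeline activity names):

```python
# Sketch of the If Condition pattern: a prior lookup supplies a row
# count, and the expression picks which branch of the pipeline runs.
def choose_branch(lookup_row_count):
    if lookup_row_count > 0:
        return "run_full_transformations"
    return "skip_and_log_empty_input"

branch_taken = choose_branch(lookup_row_count=125)
# a non-zero count routes execution to the transformation branch
```

In a real pipeline the predicate lives in the activity's expression (evaluated against the Lookup activity's output) rather than in code, but the control flow is the same.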
You are optimizing a Fabric lakehouse for analytics workloads. The external stakeholders need to access specific datasets without direct access to the lakehouse. What is the recommended approach?
A. Grant all stakeholders direct lakehouse read permissions and monitor usage
B. Create SQL analytics endpoints and publish Power BI semantic models for controlled access ✓ Correct
C. Export CSV files from the lakehouse and share via email
D. Use Power Query Online to create separate copies of data for each stakeholder
Explanation: SQL analytics endpoints and published semantic models provide governed, performant access to lakehouse data with built-in security, caching, and role-based controls without exposing raw data.
Your organization processes sensitive personal data in Fabric and must comply with data residency requirements. Which configuration ensures data remains within a specific geographic region?
A. Set the workspace region during creation and configure capacity in that region only ✓ Correct
B. Implement network security groups and firewall rules at the Fabric level
C. Enable encryption in transit and at rest for all Fabric artifacts
D. Use Dataflows with geographic routing parameters to limit data movement
Explanation: Data residency in Fabric is controlled by the workspace region and capacity region assignments; both must be in the same compliant geography to ensure data does not leave the required region.