This repository contains all items relating to Microsoft Fabric Workloads.
The notebook "Get_Files_Tables_StorageSizes" will run against your existing Lakehouse and get the storage sizes of its files and tables.
"Adding Date Time Column to Pyspark data frame" will add the current date time to an existing PySpark Data Frame.
"Azure Key Vault Auth with Service Principal" will authenticate from an Azure Key Vault with the Service Principals needed to run the Power BI or Fabric Admin APIs
"Blog - Update IR Policy" This will update your Incremental Refresh Policy based on the dates provided in the notebook.
"Reading Table from Another Lakehouse" Allows you to read a Lakehouse Table from a different App Workspace.
"Blog-Scanner API" Download the Scanner API Data to a JSON file in your Lakehouse.
"BLOG - Entra ID All Group Members" Gets all the Entra Groups and their Group Members and download it to a JSON file in your Lakehouse.
"Blog - Get Files and Table Sizes" Gets the storage size for all your files and table sizes for Lakehouses and Warehouses.
"Blog - Reading and Write different Lakehouses" Shows you how to read from a Lakehouse A in Workspace A and write to Lakehouse B in Workspace B
"Blog - Create case insensitive Warehouse" Has the code on how to create a case insensitive Warehouse.
"Blog - Create warehouse with Service Principal" Has the code on how to create a Warehouse using a service principle account.
"Blog - Get All Entra ID Groups and Users and Licenses" This gets all the Groups, Users and User licenses in your Tenant.
"Blog - Get All Fabric Items - Actual Pure Python" This gets all the Fabric Items in your tenant using a Python ONLY notebook.
"Blog - DuckDB SQL Code to read from Delta Tables" This Python notebook will show you how to read data from Lakehouse tables using SQL.
"Blog - Python - DuckDB - Writing to Lakehouse Table" This Python notebook will show you how to write data from a data frame to a Lakehouse Table using DuckDB SQL.
"Blog - Python - Run DAX Query and write to LH Table" This Python notebook will run a dax query against a semantic model and write the data frame to a Lakehouse table.
"Blog - Python - DuckDB - Querying Multiple Tables" Query multiple tables using a Python notebook using DuckDB and query a weather API.
"Blog - Python - DuckDB - Looping and write once" This notebook shows you how to loop through a stop dates and ramp the output once to a lake house table.
"BLOG - Show Table with IR Policy" Using Semantic Link Labs to connect to the Tabular Object Model and show a tables Incremental Refersh Policy.
"Blog - Export all Items to OneLake" Export all Fabric Items from your Workspace to OneLake and Azure Blob Storage.