SQL Server Reporting Services Integration with Power BI
Power BI Report Server can host Power BI reports on-premises, as covered in the previous post. There is another integration between SQL Server Reporting Services and the Power BI service: tiles from SSRS reports can be pinned to a Power BI dashboard, with scheduled updates handled by SQL Server Agent. Pinning SSRS report items to the Power BI service creates a link from the Power BI dashboard back to the detailed SSRS report.
In this post, you will learn the requirements for pinning SSRS report elements to a Power BI dashboard, and you will walk through the process in detail. Integrating SSRS reports into Power BI dashboards creates a single portal for accessing your reporting solution: users open the Power BI dashboard and navigate from there to the detailed paginated SSRS report. If you want to learn more about Power BI, read the Power BI book from Rookie to Rock Star.
Set Up Requirements
For SSRS integration to work with Power BI, you will need:
- SQL Server 2016 or higher
- SQL Server Agent should be up and running
- Only charts, gauges, and maps can be pinned to Power BI dashboards
- Power BI Integration should be enabled
- Stored Credentials should be used for the data sources of SSRS reports
Let’s look at these requirements through an example.
Prerequisites for this Example
To run this example, you will need SSRS installed along with the AdventureWorks sample SQL Server database. Explaining how to install SSRS or a SQL Server database is a big enough topic to be outside the scope of this post.
Enable Power BI Integration
The first action for the integration is to enable Power BI Integration in the Reporting Services Configuration Manager. Open Reporting Services Configuration Manager, and then go to the Power BI Service tab on the left-hand side.
The Power BI Integration is only available in SQL Server Reporting Services 2016 or higher versions.
Click on Register with Power BI and log in to your Power BI account. After registering with your account, you will see a screen such as below;
Pin Report Elements into Power BI Dashboard
If you open an SSRS report, you will see the Power BI pin option at the top of the report in the reporting services portal.
You can click on the Power BI icon to start pinning elements into the Power BI service. If you don't have any charts, gauges, or maps in your report, you cannot pin any item to the dashboard; only those element types can be pinned.
If you try this item for the first time, you will be asked to log in with your Power BI username and password.
After signing in, you will be asked to authorize SSRS to access Power BI account information.
After going through the login process, you will see all items that you can pin to the dashboard, and you can click on them.
When you click on an element to be pinned, you will be asked which workspace and dashboard you want the element pinned to, and what the refresh frequency should be.
An important requirement for pinning items to Power BI from SSRS is that your SQL Server Agent service must be up and running. The Agent service is responsible for updating the tile in the Power BI dashboard.
After successfully pinning the item, you will see a message explaining the process was successful.
You can then open the dashboard in Power BI, and you will see the chart element from SSRS pinned there.
The tile will be updated by SQL Server Agent. Every time you click on the chart, you will be redirected to the SSRS report. If you click on the tile's edit details, you can see the link to the SSRS report.
SQL Server Agent
As you learned earlier in this post, SQL Server agent is responsible for keeping that tile up to date in the Power BI Dashboard. After pinning the SSRS report element into Power BI, you can check SQL Server Agent and see that there is an agent job created for this process.
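If you want to spot that job yourself, one quick way is to list the most recently created Agent jobs; the jobs Reporting Services creates typically have GUID-style names. As a sketch, the snippet below just builds the T-SQL (using only the standard msdb job tables) so you can run it in SSMS or via sqlcmd:

```python
# Build a T-SQL query that lists SQL Server Agent jobs, newest first,
# so the job created for the pinned tile is easy to spot.
# Run the printed query in SSMS or sqlcmd against your server.
LIST_JOBS_SQL = """
SELECT j.name, j.date_created, j.enabled
FROM msdb.dbo.sysjobs AS j
ORDER BY j.date_created DESC;
"""

print(LIST_JOBS_SQL)
```

The job's schedule shown there should match the refresh frequency you chose when pinning.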
The SQL Server agent also has a schedule which can be configured differently.
This was a quick post about how charts from an SSRS report can be pinned to a Power BI dashboard. To use this functionality, some requirements must be met: you have to use SQL Server 2016 or a higher version; the report's datasets must use saved credentials; the report must contain charts, gauges, or maps, because only those element types can be pinned to the dashboard; Power BI Integration must be enabled in the Reporting Services Configuration Manager; and finally, SQL Server Agent must be up and running, because the Agent is responsible for keeping the tile up to date.
Using the integration of SSRS and Power BI, you can have tiles in a Power BI dashboard that point to SSRS reports for the detailed paginated view. Power BI users can use normal tiles for interactive reports, and SSRS tiles when they want to see the more detailed paginated report in SSRS. The integration of SSRS and Power BI creates a single portal, the Power BI dashboard, for accessing all reporting items.
2 thoughts on “ SQL Server Reporting Services Integration with Power BI ”
This is awesome. Can we include an .RDL report (not only a visual) in our PBIX? My scenario is that we have created reports using SSRS (.RDL) and want to use them with the Power BI reports (parameters and filters). Is it possible?
Since RDL reports are accessible using a URL, and you can even pass filters using URL query string parameters, you can create those links in Power BI; clicking on them will navigate to the RDL reports. Cheers, Reza
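As a rough sketch of that suggestion, the helper below assembles an SSRS URL-access link with report parameters passed in the query string. The server, folder, and parameter names are made up for illustration:

```python
from urllib.parse import quote

def ssrs_report_url(server, report_path, params=None):
    """Build an SSRS URL-access link; report parameters are passed
    as query-string arguments after the report path."""
    # In SSRS URL access, the report path follows the '?'.
    url = f"http://{server}/ReportServer?{quote(report_path)}"
    for name, value in (params or {}).items():
        url += f"&{quote(name)}={quote(str(value))}"
    return url

# Hypothetical server, folder, and parameter names:
link = ssrs_report_url("myserver", "/Sales/MonthlySales",
                       {"Year": 2017, "Region": "West"})
print(link)
# http://myserver/ReportServer?/Sales/MonthlySales&Year=2017&Region=West
```

A link like this can be used in a Power BI table or button to jump straight to the filtered RDL report.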
Reporting in SQL Server – Power BI Report Server
Power BI is a self-service business intelligence tool from Microsoft which has been steadily gaining momentum in the last couple of months. One of the well-known disadvantages of Power BI is that it is basically cloud only. A lot of companies are not yet at the point where they feel comfortable having their data in the cloud, or are premises-bound for other reasons such as data sensitivity, data sovereignty, or compliance.
Last year Microsoft released a Power BI on SQL Server Reporting Services preview for some then-"future" version of SQL Server. It basically allowed you to deploy your Power BI reports to SQL Server Reporting Services to share them with your organization. That version was based on Power BI Embedded, which was extremely limited.
In answer to this problem Microsoft has now created Power BI Report Server.
Introducing Power BI Report Server
Power BI Report Server is a new component which is included in the licensing of SQL Server Enterprise Edition with SA, or comes as a free extension of Power BI Premium. It is not a replacement for SQL Server Reporting Services, as originally speculated.
Interestingly, if you do not have an on-premises SQL Server license but you purchased Power BI Premium and you want to run Power BI Report Server on-premises, you will have the right to run a server with the equivalent number of cores in your Premium subscription.
Power BI Report Server Installation and Configuration
Installation of the Power BI Report Server is fairly straightforward. Just download and install following the wizard steps.
The only thing to note is that it does not install SQL Server, so you need to install one or have access to one before commencing your configuration. As you can see in the image below it says “Install Database Engine”, which is actually just a reminder that you need one.
Once the Power BI Report Server is installed, you need to configure it. When you open Report Server Configuration Manager, you will get a distinct sense of Déjà vu, since it looks exactly like Reporting Services Configuration Manager except for one additional option at the bottom: Power BI Service (cloud), which allows you to specify a Power BI account to connect to the Power BI Service in the cloud.
Power BI Reports
The advantage of being able to use Power BI reports is that you can create interactive reports. If you are familiar with Power BI, you will know that as a developer you can create datasets and allow your business users to create their own reports to change the visualizations and layouts to suit them without having to request that from development, saving everyone a ton of time.
Creating Power BI reports is where the catch comes in. Even if you are running everything on-premises and not using any Power BI in the cloud, you will still require a Power BI Pro license for each person who wants to author and publish reports.
To be able to publish reports to Report Server you need a special version of Power BI Desktop, currently known as Power BI Desktop (Report Server), which comes as part of the Power BI Report Server download. Its version needs to be kept in sync with the version of Power BI Report Server, as explained in the section on upgrades later on.
Power BI Report Server will allow you to view your SQL Server Reporting Services Reports as well as your Power BI reports through the Power BI app on any device, through the Power BI Report Server Web Portal.
The typical workflow would be:
An important thing to note here is that the data is not imported into the Power BI Service as with typical Power BI reports. Instead, the Report Server connects to Analysis Services (in this case on-premises) live, to ensure that the data stays safe within the boundary of the customer's firewall.
When Power BI Report Server is released initially it will only be able to connect to SQL Server Analysis Services (SSAS). Both multi-dimensional and tabular models are supported.
If you try to deploy a report which is connected to Excel, for example, you will encounter the below error:
Publishing to SQL Server Reporting Services (SSRS)
Even though deploying a report to Power BI Report Server is referred to as publishing, you won’t find Report Server under the Publish Menu.
You have to Save As… to save the report to the Report Server.
Migrating from SQL Server Reporting Services (SSRS)
You can migrate existing reports from your SQL Server Reporting Services to Power BI Report Server with minimal effort, for native mode at least:
- Backup existing SQL Server Reporting Services Database
- Install the Power BI Reporting Server
- Restore Reporting Server Databases to Power BI report server instance
- Connect to Reporting Server Databases using the Report Server Configuration Manager.
If you are using SharePoint mode there is a script you can use which can be downloaded here .
To be able to install Power BI Report Server, you need at least:
- Windows Server 2012
- SQL Server 2008 or later
- SQL Analysis Services 2012 or later
In terms of licensing, as long as you have an active SQL Server license with Software Assurance, you are good to go. Alternatively, you need Power BI Premium.
Unlike SQL Server, which only gets updated with Service Packs and new versions, Power BI Report Server will be updated 3 times a year, with additional security patches in between should they be required. This is known as the Modern Lifecycle Policy.
Since Power BI Report Server goes hand in hand with Power BI Desktop, you need to ensure that the Power BI Desktop version used by all users is upgraded at the same time as the Power BI Report Server, so that the versions remain in sync.
If you are looking for more interactivity than what your typical SSRS paginated reports can provide, and you are not yet ready to go to the cloud; then Power BI Report Server is probably the right solution for you.
The preview release is now available for download on the Power BI website.
- Power BI Report Server Preview
- Reporting Services Web Portal
- Modern Life Cycle Policy
Upgrade SQL Server SSRS to Power BI Report Server
I wanted to make a quick guide on upgrading an SSRS Report Server to Power BI Report Server. This is an in-place upgrade: we will replace the existing SSRS Report Server with Power BI Report Server. These are the upgrade steps for SQL Server 2016; I can't say how different this process is on an older version, though I can't imagine it differs much. Power BI Report Server is a standalone product now. Here are the steps you will need to follow.
- Take a full backup of your ReportServer and ReportServerTempDB databases
- Backup the SSRS Encryption Key and make sure you save the password you used to back it up.
- Stop the SSRS Report Server service
- Detach the ReportServer and ReportServerTempDB databases (we want these in case we need to roll back, your backups would work also, up to you).
- Restore the ReportServer and ReportServerTempDB databases you backed up, keep the same database names but give them new physical file names, for example, PowerBIReportServer.mdf.
- Install the PowerBI Report Server
- Start the service
- When you configure the http reservations, it will override the existing SSRS Report Server reservations. That is fine and what you want; you will see a warning when this happens, you can ignore it. If you had to roll back, when you uninstall the PowerBI Report Server it would release those reservations back to SSRS Report Manager.
- Point the service at the existing databases you just restored. An important note here: because we are effectively restoring the SSRS reporting databases to the Power BI Report Server, it's important that you don't change the names of these databases. Your SSRS report subscriptions will be looking for the original database names; if they are changed, your subscriptions will not work.
- Make sure you configure your email settings on the new server. If you fail to do this, you will get an error when you try to open existing subscriptions on the server, because it is expecting those mail settings.
- Once you have finished configuring the server, restore the SSRS encryption key you backed up. This will restart the service at which point the upgrade is complete.
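As an illustration of the restore step above — keeping the database name (which subscriptions depend on) while giving the files new physical names — the helper below generates the RESTORE statement. This is a sketch, not the exact commands from the Microsoft guide; the logical file names `ReportServer`/`ReportServer_log` are the SSRS defaults, so verify yours with `RESTORE FILELISTONLY` before running anything:

```python
def restore_with_move(db, bak_path, data_dir, new_stem):
    """Generate a RESTORE statement that keeps the database NAME
    but moves the files to new PHYSICAL names (e.g. PowerBIReportServer.mdf).
    Assumes the default logical file names '<db>' and '<db>_log'."""
    return (
        f"RESTORE DATABASE [{db}] FROM DISK = N'{bak_path}'\n"
        f"WITH MOVE N'{db}' TO N'{data_dir}\\{new_stem}.mdf',\n"
        f"     MOVE N'{db}_log' TO N'{data_dir}\\{new_stem}_log.ldf',\n"
        f"     REPLACE;"
    )

# Hypothetical backup and data paths:
print(restore_with_move("ReportServer", r"D:\backup\ReportServer.bak",
                        r"D:\data", "PowerBIReportServer"))
```

Repeat the same idea for ReportServerTempDB; the printed T-SQL is run in SSMS or sqlcmd.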
Not a lot to it, but there were a few things not mentioned in the Microsoft guide, which is why I wanted to write this up: keeping the database names the same, and the issue with email configuration and subscriptions. It's a pretty simple upgrade, and the new Power BI Report Server is a nice addition to the Microsoft BI stack.
Import Data from a Reporting Services Report
You can use a Reporting Services report that has been published to a SharePoint site or a report server as a data source in a Power Pivot workbook. The following procedure describes how to create the connection to the report and import the data to your workbook.
In this article
- Prerequisites
- Choose an import approach
- Import report data using an address of a published report
- Import report data using a URL to a data service document
- Export a report as a data feed
- Save an Atom service document (.atomsvc) file for future import operations
You must use a report definition (.rdl) file as a data source. Importing from a report model is not supported.
You must have permission to open the report under your Windows user account, and you must know the address of the report or the report server that hosts it. You can check your permissions by trying to open the report in a Web browser first. If the report opens, it confirms that you have sufficient permissions and the correct URL.
Report data is added once during import. A copy of the data is placed into the Power Pivot workbook. To pick up the latest changes to the underlying report data, you can either refresh the data from Power Pivot in Excel, or configure a data refresh schedule for the workbook after it is published to SharePoint.
You can use any of the following approaches to add Reporting Services report data to a Power Pivot workbook.
In the Power Pivot window, in the Home tab, click From Report . The Table Import wizard opens.
Click Browse and select a report server.
If you regularly use reports on a report server, the server might be listed in Recent Sites and Servers . Otherwise, in Name, type an address to a report server and click Open to browse the folders on the report server site. An example address for a report server might be http://<computername>/reportserver.
Select the report and click Open . Alternatively, you can paste a link to the report, including the full path and report name, in the Name text box. The Table Import wizard connects to the report and renders it in the preview area.
If the report uses parameters, you must specify a parameter or you cannot create the report connection. When you do so, only the rows related to the parameter value are imported in the data feed.
Choose a parameter using the list box or combo box provided in the report.
Click View Report to update the data.
Note: Viewing the report saves the parameters that you selected together with the data feed definition.
Optionally, click Advanced to set provider-specific properties for the report.
Click Test Connection to make sure the report is available as a data feed. Alternatively, you can also click Advanced to confirm that the Inline Service Document property contains embedded XML that specifies the data feed connection.
Click Next to continue with the import.
In the Select Tables and Views page of the wizard, select the check box next to the report parts that you want to import as data.
Some reports can contain multiple parts, including tables, lists, or graphs.
In the Friendly name box, type the name of the table where you want the data feed to be saved in your Power Pivot workbook.
The name of the Reporting Services control is used by default if no name has been assigned: for example, Tablix1, Tablix2. We recommend that you change this name during import so that you can more easily identify the origin of the imported data feed.
Click Preview and Filter to review the data and change column selections. You cannot restrict the rows that are imported in the report data feed, but you can remove columns by clearing the check boxes. Click OK .
In the Select Tables and Views page, click Finish .
When all rows have been imported, click Close .
An alternative to specifying a report address is to use a data service document (.atomsvc) file that already has the report feed information you want to use. A data service document specifies a URL to the report. When you import the data service document, a report feed is generated from the report and added to the Power Pivot workbook.
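To make that structure concrete, here is a minimal, hand-written .atomsvc document parsed with Python's standard library. It is an AtomPub service document in which each collection points at one report part exposed as a feed; the server name, feed URL, and titles below are illustrative, not taken from a real report:

```python
import xml.etree.ElementTree as ET

# A hand-written example of a data service (.atomsvc) document.
# The href value is hypothetical.
ATOMSVC = """<?xml version="1.0" encoding="utf-8"?>
<service xmlns="http://www.w3.org/2007/app"
         xmlns:atom="http://www.w3.org/2005/Atom">
  <workspace>
    <atom:title>Sales Report</atom:title>
    <collection href="http://myserver/reportserver?/Sales&amp;rs:Format=ATOM&amp;rc:DataFeed=xAx0x0">
      <atom:title>Tablix1</atom:title>
    </collection>
  </workspace>
</service>"""

ns = {"app": "http://www.w3.org/2007/app",
      "atom": "http://www.w3.org/2005/Atom"}
root = ET.fromstring(ATOMSVC)
# Each <collection> is one report part exposed as a data feed.
feeds = [(c.findtext("atom:title", namespaces=ns), c.get("href"))
         for c in root.iter("{http://www.w3.org/2007/app}collection")]
print(feeds)
# [('Tablix1', 'http://myserver/reportserver?/Sales&rs:Format=ATOM&rc:DataFeed=xAx0x0')]
```

Note that the document carries only URLs, never the report data itself, which is why it can be shared and imported later.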
In the Power Pivot window, in the Home tab, click From Data Feeds . The Table Import wizard opens.
In the Connect to a Data Feed page, type a friendly name to use when referring to the data source.
This name is used only within the Power Pivot workbook to refer to the data source. Later in the wizard, you will set the name of the table where the data is stored.
Type a path to the data service document (.atomsvc) file that specifies the report feed. You can specify an address to the document if it is stored on server, or you can open it from a folder on your computer. Alternatively, you can click Browse to navigate to a server that has the data service document you want to use.
Click Test connection to make sure a feed can be created using the information in the data service document.
Click Next .
The name of the Reporting Services control is used by default if no name has been assigned: for example, Tablix1, Tablix2. We recommend that you change this name during import so that you can more easily identify the origin of the imported data feed.
Open a report from Report Manager, SharePoint, or a report server.
If Excel is installed on your computer, you will be prompted to open or save the file.
Click Open to immediately view the imported data in the Power Pivot window in Excel.
If the button is not visible, the report is not running on a supported version of Reporting Services. Consider moving or copying the report to a report server that is a supported release.
Note: Reporting Services includes an Atom rendering extension that generates the feeds from report definition files. That extension, rather than Power Pivot server software, creates report feeds and data service documents used to export report data to Power Pivot workbooks. For more information using feeds in Power Pivot, see Power Pivot Data Feeds on MSDN.
If you do not have an application on your computer that can open a report feed, save the document for future use on a computer that has Power Pivot in Excel. The document that you save specifies an address to the report. It does not contain data from the report.
Click Save to store the .atomsvc file on your computer. The file specifies the report server and location of the report file.
To use the .atomsvc file later, you can open it in Power Pivot in Excel to import the report feed. For more information about how to specify a data service document for report feeds, see Import report data using a URL to a data service document in this topic.
You can also publish this file to a data feed library on SharePoint to make it available to anyone who wants to use report feeds in other workbooks or reports. For more information about data feed libraries, see Power Pivot Data Feeds on MSDN.
Install Power BI Report Server
Learn how to install Power BI Report Server.
Another option is to create a virtual machine (VM) with a Power BI Report Server Enterprise Image on Windows Server 2019 from Azure Marketplace.
Download Power BI Report Server
On the On-premises reporting with Power BI Report Server page, select Download free trial .
When you run the PowerBIReportServer.exe file, you select the free trial or you enter your product key. Read on for details.
Before you install
Before you install Power BI Report Server, we recommend you review the Hardware and Software Requirements for installing Power BI Report Server .
While you can install Power BI Report Server in an environment that has a Read-Only Domain Controller (RODC), Power BI Report Server needs access to a Read-Write Domain Controller to function properly. If Power BI Report Server only has access to a RODC, you may encounter errors when trying to administer the service.
Power BI Report Server product key
You can get the product key for Power BI Report Server from two different sources:
Power BI Premium
SQL Server Enterprise Software Assurance (SA)
Read on for details.
If you've purchased Power BI Premium, within the Premium settings tab of the Power BI admin portal, you have access to your Power BI Report Server product key. The admin portal is only available to Global Admins or users assigned the Fabric administrator role.
Selecting Power BI Report Server key displays a dialog containing your product key. You can copy it and use it with the installation.
If you have a SQL Server Enterprise SA agreement, you can get your product key from the Volume Licensing Service Center .
When installing Power BI Report Server on multiple servers for a scale-out scenario, all servers must use the same Power BI Premium product key or SQL Server Enterprise Software Assurance (SA) product key.
Install your report server
Installing Power BI Report Server is straightforward. There are only a few steps to install the files.
You don't need a SQL Server Database Engine server available at the time of install. You will need one to configure Reporting Services after install.
Find the location of PowerBIReportServer.exe and launch the installer.
Select Install Power BI Report Server .
Choose an edition to install and then select Next .
Choose either Evaluation or Developer edition.
Otherwise, enter the product key that you got from either the Power BI service or the Volume License Service Center. For more information about how to get your product key, see the Before you install section above.
Read and agree to the license terms and conditions, then select Next .
You need a Database Engine available to store the report server database. Select Next to install the report server only.
Specify the install location for the report server. Select Install to continue.
The default path is C:\Program Files\Microsoft Power BI Report Server.
After a successful setup, select Configure Report Server to launch the Reporting Services Configuration Manager.
Configure your report server
After you select Configure Report Server in the setup, you're presented with Reporting Services Configuration Manager. For more information, see Reporting Services Configuration Manager .
To complete the initial configuration of Reporting Services, you create a report server database . A SQL Server Database server is required to complete this step.
Creating a database on a different server
If you're creating the report server database on a database server on a different machine, change the service account for the report server to a credential that is recognized on the database server.
By default, the report server uses the virtual service account. If you try to create a database on a different server, you may receive the following error on the Applying connection rights step.
System.Data.SqlClient.SqlException (0x80131904): Windows NT user or group '(null)' not found. Check the name again.
To work around the error, you can change the service account to either Network Service or a domain account. Changing the service account to Network Service applies rights in the context of the machine account for the report server.
For more information, see Configure the report server service account .
A Windows service is created as part of the installation. It is displayed as Power BI Report Server; the service name is PowerBIReportServer.
Default URL reservations
URL reservations are composed of a prefix, host name, port, and virtual directory:
An example of the complete URL string might be as follows:
- http://+:80/reportserver provides access to the report server.
- http://+:80/reports provides access to the web portal.
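As a small sketch of those components, this helper splits a reservation string into its prefix, host wildcard, port, and virtual directory:

```python
def split_reservation(reservation):
    """Split a URL reservation such as 'http://+:80/reportserver'
    into its prefix, host wildcard, port, and virtual directory."""
    prefix, rest = reservation.split("://", 1)
    hostport, vdir = rest.split("/", 1)
    host, port = hostport.rsplit(":", 1)
    return {"prefix": prefix, "host": host,
            "port": int(port), "virtual_directory": vdir}

print(split_reservation("http://+:80/reportserver"))
# {'prefix': 'http', 'host': '+', 'port': 80, 'virtual_directory': 'reportserver'}
```

The `+` is the HTTP.sys wildcard meaning the reservation matches any host name on that port.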
If you're accessing the report server from a remote machine, make sure you've configured any firewall rules if a firewall is present.
Open the TCP port that you've configured for your Web Service URL and Web Portal URL. By default, they're configured on TCP port 80.
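A quick way to verify from a remote machine that the configured port is reachable through the firewall is a plain TCP probe. This is a generic sketch, and the host name below is hypothetical:

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds --
    a quick check that the report server port is reachable
    through any firewall between you and the server."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical host name; probe the default Web Service / Web Portal port:
print(port_open("myreportserver", 80))
```

If this returns False while the portal works locally on the server, the firewall rule (or the URL reservation's port) is the first thing to check.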
- To configure integration with the Power BI service so you can pin report items to a Power BI dashboard, see Integrate with the Power BI service .
- To configure email for subscriptions processing, see E-Mail settings and E-Mail delivery in a report server .
- To configure the web portal so you can access it on a report computer to view and manage reports, see Configure a firewall for report server access and Configure a report server for remote administration .
- For details on setting report server system properties in SQL Server Management Studio, see Server Properties Advanced Page . Unless it specifies otherwise, the options apply to both Power BI Report Server and SQL Server Reporting Services.
- Administrator overview
- How to find your report server product key
- Install Power BI Desktop for Power BI Report Server
- Verify a Reporting Services installation
- Configure the report server service account
- Configure report server URLs
- Configure a report server database connection
- Initialize a report server
- Configure SSL connections on a report server
- Configure windows service accounts and permissions
- Browser support for Power BI Report Server
Top 15 Differences Between SSRS vs Power BI
December 22, 2022 by Jaya Mehta
Why did Microsoft create Power BI when it already had SSRS? What is the difference between the two? What are their significant features? You might have these questions in mind when you think of Microsoft SSRS vs Power BI. This blog will help you clear up all those dilemmas.
Topics covered in the blog:
What is Power BI?
What is SSRS?
- Difference between SSRS vs Power BI
- Why is Power BI better than SSRS?
Before moving specifically to compare the two, let’s have a deep dive into Power BI and SSRS.
Power BI is a Business Intelligence and Data Visualization tool that helps transform raw data from various data sources into meaningful and valuable insights. These insights can take different forms, like graphs, charts, dashboards, and reports. In addition, Power BI offers cloud-based services that help analyse the data and create interactive visualisations.
Note: Learn how to create reports in Power BI.
SQL Server Reporting Services is a server-based reporting platform. It provides a programming and coding interface to build, test, and deploy reports, and is mainly used for generating, sharing, and viewing reports. It demands manual assistance due to its programming foundation, and it is closely associated with Visual Studio and SQL formatting. To gain in-depth detail on SSRS, click here.
The various reports generated in SSRS are cached, linked reports, parameterised, ad-hoc, snapshot, click-through, and drill-down & drill-through reports.
Read More: About Power Bi Data Analysis . Click here
Difference between SSRS and Power BI
Now comes the most awaited part where we compare both these Microsoft tools – SSRS vs Power BI.
SSRS is an older technology for generating and publishing reports, based on a programming and coding interface. In contrast, Power BI is more user-friendly, more advanced, easier to use, and has more graphical and visualisation tools.
As we have had an overview of both, it becomes easier to understand the comparison based on the following criteria. So here are the top 15 differences between SSRS vs Power BI.
Also Check: Our blog post on Power Bi Paginated Reports . Click here
Why Power BI is better than SSRS?
The reason Power BI is a hot choice for many professionals, be they Data Engineers, Database Administrators, Data Analysts, or Business Intelligence professionals, is its superior capabilities, which are listed below:
- It offers real-time updates of reports and dashboards backed up by Artificial Intelligence, Machine Learning, NLP, and Microsoft Digital Assistant, Cortana.
- Data for Power BI can be fetched from almost any source, like Excel, SQL, CSV, web files, Java APIs, etc.
- The SaaS solution provides built-in reports and dashboards.
- It provides Data Cleaning, data transmission, data modelling, and data visualisation capabilities, all under one umbrella.
- Power BI provides you with a secure connection (Row-level Security) both on-premises and in the cloud.
- It readily integrates with other tools like Dynamics 365, SharePoint, Office 365, Hadoop, SAP, MailChimp, Salesforce, Spark, etc.
Thanks for reading the article.
Having done a thorough analysis, we know that SSRS is more inclined towards coding. So if you have a technical background and a sound understanding of coding and programming, SSRS will work for you. Power BI, however, is easier to use thanks to its drag-and-drop interface and great visualisation tools. Power BI is technologically more advanced and has more graphical capabilities than SSRS.
Solving SQL Server Performance Issues for 18 Years and Counting
By: Jeremy Kadlec | Updated: 2023-11-13
SQL Sentry is an excellent example of the adage "necessity is the mother of invention." SQL Sentry originated from the needs of DBAs managing SQL Servers in a hosting company who could not find a solution to monitor their growing environment. Since that idea sparked, SQL Sentry has evolved into a solution used around the globe to quickly and accurately resolve performance issues for both the relational database engine and Microsoft analytics platforms implemented both on-premises and in the cloud.
A lot has changed over the last 18 years since SQL Sentry was first launched. How can the recent releases of SQL Sentry help me, and what is next to come?
Over the years, SQL Sentry became integral to a DBA's toolkit for resolving SQL Server performance issues. Recently, additional features have been introduced that can truly benefit DBAs globally.
SQL Sentry's Roadmap
In 2023, SQL Sentry has had multiple releases, and more are planned for 2024. Recent releases include:
- Performance - Support for deeper performance insights with wait stats and Plan Explorer support
- Monitoring – SQL Server 2022, instance level health score, and server group health score
- Cloud – RDS Agent Jobs, Azure Synapse SQL Pools, and RDS blocking monitoring
- Portal – VMware dashboard, custom charts, Top SQL enhancements, storage visualization, and more
SolarWinds is also working on new functionality for future releases, including:
- Performance – Session level wait analytics, wait time anomaly detection
- Integration – Hybrid Cloud Observability
- Cloud – Azure Elastic Pools and AWS CloudWatch metrics
- Portal – Event Calendar and Always-On support
For a complete roadmap, visit SQL Sentry Releases .
SQL Sentry's Modern Interface
One of the most eye-catching changes with SQL Sentry is the Portal. Historically, SQL Sentry's interface was a thick client running on a desktop. However, the development team has moved new functionality to the browser-based portal to improve usability and deliver a modern platform. Many of the new features are being released only to the Portal, in addition to the ability to create custom dashboards.
Check out all the SQL Sentry Web Portal Features .
New Environmental Health Score in SQL Sentry
SQL Sentry has included a Health Score for the entire environment for years and now also provides a Health Score across groups of servers and for individual servers. The Health Score now shows trends over the last 30 days, with deltas that indicate whether recent changes have been helpful or detrimental. SQL Server instances can also be categorized as Critical, Unhealthy, At Risk, or Healthy to better understand the environment and focus resources.
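To make the idea concrete, the bucketing and trending described above can be sketched in a few lines. The thresholds, category boundaries, and the 7-day/30-day windows below are illustrative assumptions, not SQL Sentry's actual scoring rules:

```python
# Hypothetical sketch of bucketing a 0-100 health score and computing a
# recent-change delta. All thresholds and windows are assumptions for
# illustration, not the product's real scoring logic.

def categorize(score: float) -> str:
    """Map a 0-100 health score to a status bucket (thresholds assumed)."""
    if score < 40:
        return "Critical"
    if score < 60:
        return "Unhealthy"
    if score < 80:
        return "At Risk"
    return "Healthy"

def trend_delta(daily_scores: list[float]) -> float:
    """Delta between the average of the most recent 7 days and the average
    of the preceding days, over a rolling 30-day window.
    Assumes at least 8 daily samples."""
    window = daily_scores[-30:]
    recent, earlier = window[-7:], window[:-7]
    return sum(recent) / len(recent) - sum(earlier) / len(earlier)
```

A positive delta would suggest recent changes helped; a negative one flags a regression worth investigating.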
Learn more about the Environmental Health Score .
Insights into SQL Server TempDB
TempDB is often a point of contention for numerous applications. This shared system database can be a source of confusion with multiple databases on a single instance creating objects from varying workloads and user demands. Often, TempDB can be the culprit of performance issues, with the source of the problem in a database that is not perceived as operating slowly. To address these scenarios, SQL Sentry has included a new interface focused on TempDB to better understand the source application, activity, sessions, memory usage, and more.
Learn more about TempDB Analysis in SQL Sentry .
Top SQL Interface Includes SQL Server Wait Stats
SQL Server Wait Stats are now included in the Top SQL interface to give DBAs and Developers insight into what (Disk, Memory, CPU, Network, etc.) their queries are waiting on inside the relational database engine. This information, used with additional metrics like duration, count, CPU, memory, reads, writes, etc., can help identify problematic queries that negatively impact the SQL Server instance.
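SQL Sentry surfaces this data automatically; for comparison, the raw counters behind it live in SQL Server's `sys.dm_os_wait_stats` DMV, which a DBA can sample directly. The sketch below is a hand-rolled approximation: the connection string is a placeholder and `pyodbc` is assumed as the driver (any DB-API driver would work the same way):

```python
# Sample SQL Server's cumulative wait counters, roughly the raw data that
# wait-stats tooling aggregates. The DMV and columns are standard; the
# connection string below is a placeholder.

TOP_WAITS_SQL = """
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    signal_wait_time_ms   -- time spent waiting for CPU after the resource freed
FROM sys.dm_os_wait_stats
WHERE wait_time_ms > 0
ORDER BY wait_time_ms DESC;
"""

def top_waits(conn_str: str):
    """Return the top wait types by cumulative wait time."""
    import pyodbc  # assumed driver; imported lazily so the module loads without it
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(TOP_WAITS_SQL).fetchall()

# Usage sketch (placeholder server name):
#   rows = top_waits("DRIVER={ODBC Driver 18 for SQL Server};SERVER=...;Trusted_Connection=yes")
```

Note these counters are cumulative since instance start, so tooling typically diffs two samples to see current activity.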
Learn more about Top SQL and Query Analysis .
- Learn more about SQL Sentry features for SQL Server DBAs.
- Get your SQL Sentry free trial to find your biggest performance challenges.
- Check out a public demo of the SQL Sentry Portal with SQL Server instances on-premises and in the cloud.
Book of News
November 15 - 17, 2023
Introduction
Foreword from Frank X. Shaw
Welcome once again to Microsoft Ignite and this year’s edition of the Book of News. It’s an action-packed version that features more than 100 announcements in a wide range of topics, including infrastructure, Microsoft Copilot, the relationship between data and AI, new tools for developers and security.
At this year’s Microsoft Ignite, our flagship event for IT developers and business decision makers, we are expecting 4,500 in-person attendees in Seattle and more than 175,000 registrants who are participating digitally. Everyone will learn about new products and updates launching today and hear from senior leaders and subject matter experts about what’s on the horizon.
The Book of News is designed to be your guide to all our announcements, making it easy for you to navigate the latest information and provide key details on the topics in which you are most interested. We are excited to share some groundbreaking new products and critical updates that help make work and life easier and more productive.
The overarching theme for this year’s Ignite is how we are working to empower our customers, partners and developers to thrive in the era of AI. In 2023, we witnessed entirely new ways of working via technological advances. Organizations count on their partners to provide innovative, efficient and safe solutions that lead to meaningful business outcomes, and we at Microsoft are proud to deliver those results.
We have a great lineup of news and exciting moments planned for this year’s Microsoft Ignite. I hope that you can join us.
As usual, we eagerly want your feedback. Please let us know how we can do better. We want to make sure you receive the information and context you need from this event. What can we do to make the experience even better next time?
What is the Book of News?
The Microsoft Ignite Book of News is your guide to key news items that we are announcing at Microsoft Ignite. The interactive Table of Contents gives you the option to select the items you are interested in, and the translation capabilities make the Book of News more accessible globally. (Just click the Translate button below the Table of Contents to enable translations.)
We pulled together a folder of imagery related to a few of the news items. Please take a look at the imagery here . To watch keynotes and sessions related to news items, we have links below the news to get you quick access to upcoming sessions and on-demand videos.
We hope the Book of News provides all the information, executive insight and context you need. If you have any questions or feedback regarding content in the Book of News, please email [email protected] .
If you are interested in speaking with an industry analyst about news announcements at Microsoft Ignite or Microsoft’s broader strategy and product offerings, please contact [email protected] .
1. Azure 1.1. Azure AI Services
1.1.1. Azure Machine Learning updates streamline and operationalize AI
Azure Machine Learning continues to improve user experiences with new enhancements, including the general availability of prompt flow and model catalog and the preview of an integration with OneLake in Microsoft Fabric. These updates empower developers and machine learning professionals to streamline the development of AI-powered applications and operationalize responsible generative AI solutions across all stages of the generative AI development lifecycle.
Updates to Azure Machine Learning include:
- Prompt flow streamlines the entire development lifecycle of applications powered by large language models (LLMs). It enables developers to design, construct, evaluate and deploy LLM workflows, connecting to a variety of foundation models, vector databases, prompts and Python tools through both visualized graphs and code-first experiences in CLI, SDK and Visual Studio Code extension. Prompt flow is now generally available in Azure Machine Learning and in preview in Azure AI Studio.
- Model catalog will empower users to discover, evaluate, fine-tune and deploy foundation models from renowned providers, such as Hugging Face, Meta and OpenAI, facilitating developers in selecting the optimal foundation models for their specific use cases. Within the model catalog, users can find a comprehensive comparison view for benchmarking metrics of multiple foundation models, allowing users to self-educate and make informed decisions about the suitability of models and datasets for their specific use cases. Model catalog has expanded to include new models like Code Llama, Stable Diffusion and OpenAI’s CLIP (Contrastive Language-Image Pretraining) models. Model catalog will be generally available soon and is available in preview in Azure AI Studio, broadening its availability and applicability.
- Model-as-a-Service through inference APIs and hosted-fine-tuning , coming soon in preview, will enable developers and machine learning professionals to easily integrate foundation models such as Llama 2 from Meta, upcoming premium models from Mistral, and Jais from G42 as an API endpoint to their applications and fine-tune models without having to manage the underlying GPU infrastructure.
- OneLake is now available in preview as a datastore in Azure Machine Learning, facilitating a seamless transition between Microsoft Fabric and Azure Machine Learning. This integration allows data engineers to share machine learning-ready data assets developed in Fabric, enabling machine learning professionals to directly utilize them for model training in Azure Machine Learning. Additionally, machine learning professionals can write model predictions back to OneLake for further processing in Fabric or to surface insights through Power BI.
Additional resources:
- Blog: Ignite 2023: What’s new in Azure AI Platforms – Charting the Future with Innovative AI and ML
- Keynote: Microsoft Cloud in the era of AI
- Breakout: End-to-End AI App Development: Prompt Engineering to LLMOps
- Breakout: Evaluating and Designing Responsible AI Systems for The Real World
- Demo: Llama 2 inference APIs & hosted fine-tuning in Azure AI model catalog
- Demo: Monitor ML and GenAI apps for safety and quality in production
- Demo: Operationalize ML Lifecycle with managed feature store
- Demo: Transform your AI workflows with RAG in prompt flow
1.1.2. Microsoft launching AI platform Azure AI Studio
Microsoft is launching the preview of its unified AI platform, Azure AI Studio , which will empower all organizations and professional developers to innovate and shape the future with AI.
The platform, accessibly and responsibly designed, will equip organizations with a one-stop shop to seamlessly explore, build, test and deploy AI solutions using state-of-the-art AI tools and machine learning models, all grounded in responsible AI practices. Developers will be able to build generative AI applications, including copilot experiences, using out-of-the-box and customizable tooling and models.
Users can choose the data source, including Microsoft Fabric OneLake and Azure AI Search, for vector embeddings, select models from a comprehensive catalog of frontier and open-source models, orchestrate prompt flows, evaluate model responses, identify fine-tuning opportunities and scale proof of concepts into full production with continuous monitoring and refinement.
- Blog: Unleashing the Power of Generative AI: Azure AI Studio Leads the Way
- Blog: Driving inclusive AI innovation with Azure AI Studio .
- Blog: Driving AI Innovation Across Digital Landscapes – Azure AI and Microsoft Fabric Integration.
- Demo: AI-powered Search: From Prototypes to Production with Azure AI Studio
- Discussion: Building Copilots for Enterprise Experiences Q&A
- Discussion: Empowering Inclusive Innovation with Azure AI Studio Q&A
- Breakout: Build your own Copilot with Azure AI Studio
- Breakout: What’s New in Generative AI?
1.1.3. New capabilities for developers to build generative AI solutions safely, responsibly
Microsoft leads the industry in the safe and responsible use of AI. The company has set the standard with an industry-leading commitment to defend and indemnify commercial customers from lawsuits for copyright infringement with the Copilot Copyright Commitment (CCC). Today, Microsoft takes its commitment one step further by announcing the expansion of the CCC to customers using Azure OpenAI Service. The new benefit will be called the Customer Copyright Commitment. As part of this expansion, Microsoft has published new documentation to help Azure OpenAI Service customers implement technical measures to mitigate the risk of infringing content. Customers will need to comply with the documentation to take advantage of the benefit.
And Azure AI Content Safety, now generally available, helps organizations detect and mitigate harmful content and create better online experiences. Customers can use Azure AI Content Safety as a built-in-safety system within Azure OpenAI Service, for open-source models as part of their prompt engineering in Azure Machine Learning or as a standalone API service.
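As a standalone API, Content Safety is a plain REST call. The sketch below only composes the request and reads the per-category severity from a response; the endpoint path and `2023-10-01` api-version reflect the 2023 GA surface but should be verified against current documentation, and all endpoint values are placeholders:

```python
# Sketch of the Azure AI Content Safety text-analysis REST surface.
# Endpoint path and api-version are believed-current assumptions; verify
# against the official REST reference before relying on them.

API_VERSION = "2023-10-01"

def build_analyze_request(endpoint: str, text: str, categories=None):
    """Return (url, json_body) for a text:analyze call."""
    url = f"{endpoint}/contentsafety/text:analyze?api-version={API_VERSION}"
    body = {"text": text}
    if categories:  # e.g. ["Hate", "SelfHarm", "Sexual", "Violence"]
        body["categories"] = categories
    return url, body

def max_severity(response_json: dict) -> int:
    """Highest severity across returned categories (0 = nothing flagged)."""
    return max(
        (c.get("severity", 0) for c in response_json.get("categoriesAnalysis", [])),
        default=0,
    )
```

The request would then be sent with `requests.post(url, json=body, headers={"Ocp-Apim-Subscription-Key": key})`, the standard header for keyed Azure AI services, and `max_severity` can gate whether content is allowed through.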
- Download: Visual assets
- Blog: Microsoft Azure AI, data, and application innovations help turn your AI ambitions into reality
- Learn: Customer Copyright Commitment Required Mitigations
- Keynote: Microsoft Ignite opening
1.1.4. New features for Azure AI Vision
Azure AI Vision offers innovative computer vision capabilities to empower developers to analyze images, read text and detect faces with pre-built image tagging, text extraction with optical character recognition (OCR) and responsible facial recognition.
Several new updates to the solution, including:
Liveness functionality and Vision SDK: Liveness functionality will help prevent face recognition spoofing attacks and conforms to ISO 30107-3 PAD Level 2. Vision SDK for Face will enable developers to easily add face recognition and liveness to mobile applications. Both features are in preview.
Image Analysis 4.0: This API introduces cutting-edge Image Analysis models, encompassing image captioning, OCR, object detection and more, all accessible through a single, synchronous API endpoint. Notably, the enhanced OCR model boasts improved accuracy for both typed and handwritten text in images. Image Analysis 4.0 is generally available.
Florence foundation model: Trained with billions of text-image pairs and integrated as cost-effective, production-ready computer vision services in Azure AI Vision, this improved feature enables developers to create cutting-edge, market-ready, responsible computer vision applications across various industries. Florence foundation model is generally available.
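Because Image Analysis 4.0 exposes captioning, OCR and object detection through one synchronous endpoint, client code stays small. The helper below only composes the request; the `2023-10-01` api-version and feature names are assumptions to check against the current REST reference, and the URLs are placeholders:

```python
# Sketch of composing an Image Analysis 4.0 request. The endpoint path,
# api-version, and feature names are assumptions based on the 2023 GA
# surface; verify against the official documentation.

def build_image_analysis_request(endpoint: str, image_url: str,
                                 features=("caption", "read")):
    """Return (url, json_body) for a single synchronous analysis call
    requesting the given features (e.g. caption + OCR)."""
    url = (f"{endpoint}/computervision/imageanalysis:analyze"
           f"?api-version=2023-10-01&features={','.join(features)}")
    return url, {"url": image_url}
```

One POST with this URL and body would return captions, OCR text, and detected objects in a single JSON payload, keyed by the requested features.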
- Blog: Are You Alive: Enhancing Azure AI Vision Face API with Liveness Detection
- Blog: Announcing general availability of Azure AI Vision Image Analysis 4.0 API
- Lab: Implement a Computer Vision Solution with Azure AI Vision
1.1.5. New multimodal AI capabilities now available in Azure OpenAI Service
Azure OpenAI Service has unveiled several multimodal AI capabilities to empower businesses to build generative AI experiences with image, text and video. They include:
- DALL·E 3: An AI model that generates images from text descriptions. Users describe an image, and DALL·E 3 creates it. DALL·E 3 is in preview.
- GPT-3.5 Turbo model with a 16k token prompt length and GPT-4 Turbo: The latest models in Azure OpenAI Service will enable customers to extend prompt length and bring even more control and efficiency to their generative AI applications. Both models will be available in preview at the end of November 2023.
- GPT-4 Turbo with Vision (GPT-4V): When integrated with Azure AI Vision, GPT-4V will enhance experiences by allowing the inclusion of images or videos along with text for generating text output, benefiting from Azure AI Vision enhancement like video analysis. GPT-4V will be in preview by the end of 2023.
- GPT-4 updates: Azure OpenAI Service has also rolled out updates to GPT-4, including the ability for fine-tuning. Fine-tuning will allow organizations to customize the AI model to better suit their specific needs. It’s akin to tailoring a suit to fit perfectly, but in the world of AI. Updates to GPT-4 are in preview.
These advancements in Azure OpenAI Service open new possibilities for businesses and users alike. With DALL·E 3 and GPT-4 Turbo for Vision, creativity knows no bounds, and communication with machines becomes more intuitive. The availability of GPT-3.5 Turbo with a 16K token prompt length, and GPT-4 Turbo, along with updates to GPT-4, will enable improved adaptability and efficiency, making it even more useful across various industries.
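For GPT-4 Turbo with Vision, the interesting client-side change is the message shape: chat content becomes a list that mixes text and image parts. A minimal helper, assuming the OpenAI chat-completions message format (the example URL is a placeholder):

```python
# Build a chat-completions message that mixes text and images, the shape
# used by vision-capable models. Follows the OpenAI message format; the
# image URLs here are placeholders.

def vision_messages(question: str, image_urls: list[str]) -> list[dict]:
    """One user turn containing a text part plus one part per image."""
    content = [{"type": "text", "text": question}]
    content += [{"type": "image_url", "image_url": {"url": u}} for u in image_urls]
    return [{"role": "user", "content": content}]
```

The result can be passed as `messages=` to a chat-completions call against an Azure OpenAI GPT-4V deployment.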
- Blog: GPT-4 Turbo with Vision on Azure OpenAI Service
1.1.6. New summarization and translation capabilities in Azure AI
Several new features in Azure AI will aid developers when summarizing and translating language for app usage. Updates, now in preview, include:
- A new task-optimization summarization capability in Azure AI Language, powered by large language models (GPT-3.5-Turbo, GPT-4, Z-Code++ and more).
- A new machine translation model capable of translating from one language to another without translating in English as an intermediary. In addition, it can be customized using customer data to better align translations to the industry’s context.
- Named entity recognition , document translation and summarization in containers will allow government agencies and industries, such as financial services and healthcare, with strict data residency requirements to run AI services on their own infrastructure.
- Personal voice, a new custom neural voice feature that will enable businesses to create custom neural voices with 60 seconds of audio samples for their users. Personal voice is a limited access feature.
- Text-to-speech avatar , a new text-to-speech capability that will generate a realistic facsimile of a person speaking based on input text and video data of a real person speaking. Both prebuilt and custom avatars are now in preview, however, custom avatar is a limited access feature.
- Blog: Empowering developers to use natural language capabilities using containers
- Blog: Azure AI Speech launches Personal Voice in preview
- Blog: Azure AI Speech announces public preview of text to speech avatar
- Demo: Next-gen summarization powered by generative AI
- Demo: Building brands with Azure OpenAI Service and Azure AI Speech
1.1.7. Unlock video insights with updates in Azure OpenAI
The powerful integration of Azure AI Video Indexer, Azure AI Search and Azure OpenAI Service offers users a comprehensive solution for capturing essential insights from video content and enables natural language question-answering, video summarization and efficient content search. These new features, now in preview, include:
- Video-to-text summary: Users will be able to extract the essence of video content and generate concise and informative text summaries. The advanced algorithm segments videos into coherent chapters, leveraging visual, audio and text cues to create sections that are easily accommodated in large language model (LLM) prompt windows. Each section contains essential content, including transcripts, audio events and visual elements. This is ideal for creating video recaps, training materials or knowledge-sharing.
- Efficient Video Content Search: Users will be able to transform video content into a searchable format using LLMs and Video Indexer’s insights. By converting video insights into LLM-friendly prompts, the main highlights are accessible for effective searching. Scene segmentation, audio events and visual details further enhance content division, allowing users to swiftly locate specific topics, moments or details within extensive video.
- Email: Contact the Microsoft Media and Analyst Events Team for more information
1.1.8. Vector search and semantic ranker now generally available in Azure AI Search
Azure AI Search (formerly known as Azure Cognitive Search) is an information search and retrieval platform that enables organizations to deliver highly personalized experiences in their generative AI applications.
Updates to Azure AI Search include:
- Vector search: Large language models (LLMs) traverse large volumes of documents and information to generate responses to user queries. This is expensive and can result in slow response times. Because computers are faster and more efficient when working with numerical data than with documents, techniques have been developed that turn documents and data into a numerical format called a vector, a long array of floating-point numbers. Vector search indexes allow for faster, more efficient retrieval. Good vector support is as important to getting a good answer out of an LLM as the LLM itself. Vector search is generally available.
- Semantic ranker (formerly known as semantic search): With multilingual, deep learning models adapted from Microsoft Bing, semantic ranker prioritizes results so that the most relevant are delivered first. Semantic ranker is generally available.
- Availability in Azure AI Studio: Azure AI Search is now available in Azure AI Studio, a new unified AI platform, currently in preview. A robust search and retrieval system is a critical component of the generative AI systems’ development lifecycle. Therefore, making Azure AI Search available in Azure AI Studio will enable support for the full value chain, streamlining workflow for the app developer by putting everything they need in one place.
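Stripped of the service machinery, the ranking idea behind vector search reduces to nearest-neighbor lookup by similarity. A toy sketch using cosine similarity over hand-made vectors (a real system would use embeddings from a model and Azure AI Search's index rather than a Python list):

```python
# Toy nearest-neighbor vector search by cosine similarity. Real systems use
# model-generated embeddings and an approximate index; this shows only the
# core ranking idea over a plain list.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity; assumes non-zero vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def vector_search(query_vec, index, k=3):
    """index: list of (doc_id, embedding). Top-k doc_ids by similarity."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

Exhaustive scoring like this is linear in index size, which is exactly why production services layer approximate-nearest-neighbor indexes on top of the same similarity measure.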
- Blog: Announcing general availability of vector search and semantic ranker in Azure AI Search
- Discussion: Vector Search: Retrieval Augmented Generation and Generative AI Apps
- Demo: Revolutionizing Search Relevance with Semantic Ranking
- Breakout: Vector search and state of the art retrieval for Generative AI apps
1.2. Azure Compute
1.2.1. New AMD-based Azure virtual machines now in preview
Azure is introducing the latest AMD-based virtual machines (VMs) built on the 4th Generation AMD EPYC™ Genoa processor. Now in preview for the D, E and F family VMs, these new VMs introduce even greater performance and reliability than the previous AMD v5 VMs based on the 3rd Generation AMD EPYC™ Milan processor. The new Genoa-based VMs will have different memory-to-core ratios spread across the three VM series – the general purpose Dav6 and Dalv6 series, the memory-optimized Eav6 series and the compute-optimized Fav6, Falv6 and Famv6 series.
The Dav6 VM series provides a good balance between memory-to-core ratio, while the Dalv6 series is meant to provide a more cost-effective option for applications that require less memory. The Eav6 VM series is built for applications demanding higher memory-to-core ratios. The Fav6, Falv6 and Famv6 series all have increased CPU performance in comparison to the D and E series AMD VMs and are different only in their memory-to-core ratio.
These new VMs will significantly expand the VM options available for AMD customers, spanning a multitude of popular VM series of varying memory sizes and budget.
- Blog: Public Preview: New AMD-based VMs with Increased Performance, Azure Boost, and NVMe Support
- Breakout: What’s new and what’s next with Azure IaaS
1.2.2. Running SAP HANA on Azure with new and powerful infrastructure options
The Azure M-series Mv3 family , the next generation of memory optimized virtual machines (VMs), gives customers faster insights, more uptime, lower total cost of ownership and improved price-performance for running SAP HANA workloads with direct Azure IaaS deployments and SAP RISE on Azure. The Mv3 VMs are powered by the 4th-generation Intel® Xeon® Scalable processors and Azure Boost, Microsoft’s system for offloading virtualization. The Mv3 family will scale for SAP workloads, ranging from less than 1 TB to 32 TB.
The Mv3 platform offers improved resilience against common failures in memory, storage and networking, resulting in minimal interruptions to mission-critical workloads. Mv3 delivers up to 30 percent faster SAP HANA (High-performance ANalytic Appliance) data load times for SAP OLAP (Online Analytical Processing) workloads compared to the previous generation Mv2, and up to 15 percent higher performance per core for SAP OLTP (Online Transaction Processing) workloads over Mv2.
Powered by Azure Boost, Mv3 delivers up to two times throughput to Azure premium solid-state drive (SSD) disk storage and up to 25 percent improvement in network throughput over Mv2. Azure Boost is a new system that offloads virtualization processes traditionally performed by the hypervisor and host OS, such as networking, storage and host management, onto purpose-built hardware and software. Azure Boost achieves several benefits for Mv3 VMs, including enhanced network and storage performance at scale, improved security through an additional layer of logical isolation and reduced maintenance impact during future Azure software and hardware upgrades.
Updates to the Mv3 family include:
- The Mv3 medium memory offering, with VM sizes up to 4 TB of memory, is now generally available.
- The Mv3 very high memory offering, with 32 TB of memory, is now in preview.
- See also: 1.5.1. AI infrastructure updates for more info on Azure Boost
- Blog: Announcing public preview of new Mv3 Medium Memory Virtual Machines
- Breakout: How Microsoft IT uses Azure to run a modern SAP environment
1.3. Azure Confidential Computing
1.3.1. Confidential containers on Azure Kubernetes Service in preview
Confidential containers on Azure Kubernetes Service (AKS) is the first cloud service offering pod-level isolation and memory encryption in a managed Kubernetes service based on the open-source Kata containers project and powered by AMD SEV-SNP. Organizations will be able to migrate their most sensitive container workloads to the cloud without any code changes, while protecting their data in memory from external and internal threats. Confidential containers on AKS is now in preview.
- Blog: Learn more about this update .
- Product Roundtable: Using confidential VMs and containers to enable new use cases
1.3.2. Microsoft Azure Managed Confidential Consortium Framework in preview
Microsoft Azure Managed Confidential Consortium Framework , now in preview, is a new Azure service that will offer execution of the Microsoft Confidential Consortium Framework (CCF) open-source SDKs as a managed service, eliminating the need for developers to stand up their own infrastructure to support a CCF API endpoint. Developers will be able to more easily build and manage confidential multi-party applications with decentralized trust on a secured and governed network of trusted execution environments.
- Keynote: Cloud Native Innovations with Mark Russinovich
1.3.3. New confidential virtual machine option for Azure Databricks
The confidential virtual machine (VM) option for Azure Databricks is now generally available.
Customers seeking to better ensure privacy of personally identifiable information (PII) or other sensitive data while analyzing that data in Azure Databricks can now do so by specifying AMD-based confidential VMs when creating an Azure Databricks cluster. Running a customer’s Azure Databricks cluster on Azure confidential VMs enables Azure Databricks customers to confidently analyze their sensitive data in Azure.
1.3.4. New confidential virtual machines with Intel processors in preview
The DCesv5-series and ECesv5-series confidential virtual machines (VMs) are now in preview. Featuring 4th Gen Intel® Xeon® Scalable processors, these VMs are backed by an all-new hardware-based trusted execution environment called Intel® Trust Domain Extensions (TDX). Organizations will be able to use these VMs to seamlessly bring confidential workloads to the cloud without any code changes to their applications.
1.3.5. New features and services for Azure confidential virtual machines
New features and services for Azure confidential virtual machines (VMs) include Red Hat Enterprise Linux (RHEL) 9.3 support, Disk Integrity Tool, temporary disk encryption, new region support and trusted launch as default in PowerShell for all Azure Gen 2 VMs.
RHEL 9.3 support for AMD SEV-SNP confidential VMs will allow Azure customers to specify the RHEL 9.3 image as the guest operating system (OS) for AMD-based confidential VMs. This will ensure any sensitive data processed by their RHEL guest OS is protected in use, in memory. Azure AMD-based confidential VMs provide a strong, hardware-enforced boundary that hardens the protection of the guest OS against host operator access and other Azure tenants. These VMs are designed to help ensure that data in use, in memory, is protected from unauthorized users using encryption keys generated by the underlying chipset and inaccessible to Azure operators. RHEL 9.3 support for AMD SEV-SNP confidential VMs is in preview.
Disk Integrity Tool for Intel TDX confidential VMs will allow customers to measure and attest to a disk in their confidential VM. The tooling comes as an Azure CLI extension that a user can install in their own trusted environment to run a few simple commands to protect the disk. When such integrity protected disks are used for confidential VM deployments, after the VM boots, users will be able to cryptographically attest that OS disk’s root/system partition contents are secure and as expected before processing any confidential workloads. Disk Integrity Tool for AMD SEV-SNP confidential VMs is in preview.
Temporary disk encryption for AMD SEV-SNP confidential VMs will allow Azure customers to encrypt the temporary disk attached to their AMD-based confidential VMs using customer-managed keys. This ensures any sensitive data on those disks is protected at rest. Temporary disk encryption for AMD SEV-SNP confidential VMs is in preview.
New region support for AMD SEV-SNP confidential VMs is now generally available in the following new regions: Southeast Asia, Central India, East Asia, Italy North, Switzerland North, Japan East, Germany West Central and UAE North.
Trusted launch as default in PowerShell for all Azure Gen 2 VMs, now generally available, hardens Azure Virtual Machines with security features that allow administrators to deploy virtual machines with verified and signed bootloaders, OS kernels and a boot policy. This is accomplished via such trusted launch features as secure boot, vTPM and boot integrity monitoring that protect against boot kits, rootkits and kernel-level malware.
1.3.6. New NCCv5 series confidential virtual machines with NVIDIA H100 GPUs in preview
The NCCv5 series confidential virtual machines with NVIDIA H100 Tensor Core GPUs, in preview, will be the first and only cloud offering of its kind that will allow AI developers to deploy their GPU-powered applications confidentially. This will ensure data in both CPU and GPU memory is always encrypted by using keys generated by hardware and is protected from unauthorized alteration. Data scientists needing to train their models and gain insights from multiple third-party data sources will be able to do so while ensuring personal data and AI models are kept private and provide evidence of their confidentiality through attestation reports.
- Keynote: Inside Microsoft AI innovations with Mark Russinovich
- Product Roundtable: Use cases for confidential ML including those using Azure Confidential VMs with NVIDIA H100 GPUs
1.4. Azure Data
1.4.1. Amazon S3 shortcuts now generally available
Amazon S3 shortcuts, now generally available, allow organizations to unify their data in Amazon S3 with their data in OneLake. With this update, data engineers can create a single virtualized data lake for their entire organization across Amazon S3 buckets and OneLake – without the latency of copying data from S3 and without changing overall data ownership.
Data lakes in S3 buckets can continue to exist and be managed externally to Microsoft Fabric. Data is mapped to the same unified namespace and can be accessed using the same Azure Data Lake Storage Gen2 APIs even when the data is coming from S3. Fabric experiences and analytical engines can directly connect to virtualized S3 data in OneLake.
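As an illustration of the unified namespace, data behind an S3 shortcut is addressed with the same ADLS Gen2-style URI as native OneLake data. The sketch below builds such a URI; the `onelake.dfs.fabric.microsoft.com` endpoint is OneLake's documented DFS endpoint, while the workspace, lakehouse and folder names are hypothetical.

```python
def onelake_uri(workspace: str, lakehouse: str, path: str) -> str:
    """Build an ADLS Gen2-style abfss:// URI for data in OneLake.

    The same URI shape works whether `path` points at native OneLake
    data or at data virtualized through an Amazon S3 shortcut.
    """
    return (f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse}.Lakehouse/Files/{path}")


# Hypothetical names; 'sales/s3_shortcut' stands in for an S3 shortcut folder.
uri = onelake_uri("analytics-ws", "sales_lake", "sales/s3_shortcut/orders.parquet")
print(uri)
```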
This enables organizations to accelerate the overall value of their data estate with Fabric across clouds, empowering them to leverage generative AI capabilities like Copilot in Microsoft Fabric or build tailor-made large language models grounded on their data with Azure AI Studio.
- Blog: Learn more about this update.
- Breakout: Unify your data across domains, clouds and engines in OneLake
1.4.2. Azure Data Lake Storage Gen2 shortcuts now available
Azure Data Lake Storage Gen2 (ADLS Gen2) shortcuts are now generally available, empowering data engineers to connect data from external data lakes in ADLS Gen2 into OneLake through a live connection to the target data.
With this update, data from ADLS Gen2 can be reused without copying it, eliminating data duplication and lowering integration cost across an enterprise. By creating an ADLS Gen2 shortcut, data is made ready for consumption through custom large language models (LLMs) or Power BI visuals. ADLS Gen2 shortcuts also accelerate the overall value of the data estate by enabling interoperability with Azure Databricks.
Through an ADLS Gen2 shortcut, customers can now receive fast performance through Power BI Direct Lake Mode with Azure Databricks. Since OneLake uses the same APIs as ADLS Gen2 and supports the same Delta Parquet format for data storage, Azure Databricks notebooks can be easily updated to use the OneLake endpoints for data stored in OneLake.
- Breakout: Make your data AI ready with Microsoft Fabric and Azure Databricks
1.4.3. Azure SQL updates offer better cost optimization, deeper integration
Several new features and updates for Azure SQL will make the offering more cost-efficient, reliable and secure. These updates include:
Lower pricing for Azure SQL Database Hyperscale compute
New pricing on Azure SQL Database Hyperscale offers cloud-native workloads the performance and security of Azure SQL at the price of commercial open-source databases. Hyperscale customers can save up to 35 percent on the compute resources they need to build scalable, AI-ready cloud applications of any size and I/O requirement. The new pricing will be generally available in mid-December 2023.
Azure SQL Managed Instance free trial offer, coming soon in preview, will allow customers to discover, use and explore Azure SQL Managed Instance free of charge for 12 months. Customers will be able to run proofs of concept, test applications or simply learn more about the operational benefits of a fully managed database-as-a-service.
The free trial offer will provide substantial compute and storage to test applications, including:
- 4 vCores or 8 vCores of compute, limited to 720 vCore hours per month.
- 64 GB of storage, plus 64 GB of backup storage.
- Start and stop instance compute resources on demand.
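Under the 720 vCore-hour monthly cap, the number of hours the free instance can run each month depends on its vCore count; a quick back-of-the-envelope calculation:

```python
def free_trial_runtime_hours(vcores: int, monthly_cap_vcore_hours: int = 720) -> float:
    """Hours per month a free-trial instance can run before hitting the cap."""
    return monthly_cap_vcore_hours / vcores


# A 4-vCore instance can run 180 hours per month; an 8-vCore instance, 90 hours.
print(free_trial_runtime_hours(4))  # 180.0
print(free_trial_runtime_hours(8))  # 90.0
```

Stopping the instance when idle (see the start/stop feature below) stretches the cap across the month.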
Azure SQL Managed Instance feature wave, now generally available, introduces a bundle of features that work together to make SQL Managed Instance even more performant, reliable and secure, while enabling even deeper integration with on-premises SQL Server and the broader Azure service platform. Features included in the bundle:
- Instance start/stop, which allows customers to start and stop their instance at their discretion to save on billing costs for vCores and SQL licensing.
- Zone redundancy, which lets customers deploy their managed instance across multiple availability zones and improve the availability of their service.
- Distributed Transaction Coordinator, which lets customers run transactions across multiple database types while keeping the databases in sync.
- Blog: Power what’s next with limitless relational databases from Azure
- Breakout: Get superior price and performance with Azure cloud-scale databases
- Demo: Migrate to innovate: Modernize your data on Azure SQL Managed Instance
- Demo: Scale up your SQL workloads with Azure SQL Database
1.4.4. Microsoft 365 data in Fabric with native OneLake integration
Microsoft 365 data can now natively integrate with OneLake in the Delta Parquet format, the optimal format for data analysis. Microsoft 365 data was previously offered only in JSON format. With this new integration, Microsoft 365 data can be seamlessly joined with other data sources in OneLake, enabling access to a suite of analytical experiences for organizations to transform and gain insight from their data. This also means that AI capabilities built using Microsoft Fabric notebooks can now directly access Microsoft 365 data within OneLake. This update is in preview.
1.4.5. Microsoft Fabric now generally available
Microsoft Fabric, an integrated and simplified experience for a data estate on an enterprise-grade data foundation, is now generally available. Fabric enables persistent data governance and a single capacity pricing model that scales with growth, and it’s open at every layer with no proprietary lock-ins. Fabric integrates Power BI, Data Factory and the next generation of Synapse to offer customers a price-effective and easy-to-manage modern analytics solution for the era of AI.
Fabric is for the entire enterprise, complete with role-tailored tools and deep integrations with Microsoft 365, Teams and AI copilots to accelerate analytics capabilities and help scale data value creation for everyone from data professionals to non-technical business users.
1.4.6. Microsoft Fabric, now part of Microsoft Intelligent Data Platform, empowers ISVs
Microsoft Intelligent Data Platform (MIDP) is a set of tightly integrated data services that includes Microsoft Fabric, a unified analytics service that is now generally available. The intelligent data platform empowers organizations to invest more time creating value rather than integrating and managing their data estate.
Built as easy-to-use software as a service (SaaS), Fabric is open and extensible, providing a rich set of capabilities for independent software vendors (ISVs) to further enrich the platform with industry-leading applications. At Ignite, Microsoft will showcase how industry-leading partners like London Stock Exchange, Esri, Informatica, Teradata and SAS are bringing their product experiences as workloads into Fabric. This will help partners widen their reach and expand the breadth of capabilities that our mutual customers can access seamlessly with Microsoft Fabric.
- Breakout: Build ISV apps with Microsoft Fabric in the Intelligent Data Platform
1.4.7. New features in Azure Cosmos DB increase developer productivity, cost efficiency
Several new features in Azure Cosmos DB will help developers deliver apps in a more efficient manner while also reducing production cost. These updates include:
Dynamic scaling per partition/region, now in preview for new Azure Cosmos DB accounts, will allow customers to optimize for scale and cost in situations where partitioning is used to scale individual containers in a database to meet the performance needs of applications, or where multi-region configuration of Azure Cosmos DB is used for global distribution of data.
Dynamic scaling provides developers with added flexibility to save costs by scaling up and down their database needs on a more granular level, either by region or by partition of their data. This is cost-friendly for customers who run into hot partitions in their databases or have operations around the globe.
Microsoft Copilot for Azure integration in Azure Cosmos DB, now in preview, will bring AI into the Azure Cosmos DB developer experience. Specifically, this release enables developers to turn natural language questions into Azure Cosmos DB NoSQL queries in the query editor of Azure Cosmos DB Data Explorer. This new feature will increase developer productivity by generating queries and written explanations of the query operations as they ask questions about their data.
Azure Cosmos DB for MongoDB vCore, now generally available, allows developers to build intelligent applications in Azure with MongoDB compatibility. With Azure Cosmos DB for MongoDB vCore, developers can enjoy the benefits of native Azure integrations, low total cost of ownership and the familiar vCore architecture when migrating existing applications or building new ones. Azure Cosmos DB for MongoDB vCore is also introducing a free tier, which is a developer-friendly way to explore the platform's capabilities without any cost. Learn more about the free tier.
In addition, a new Azure AI Advantage offer will help customers realize the value of Azure Cosmos DB and Azure AI together. Benefits include:
- Savings up to 40,000 RU/s for three months on Azure Cosmos DB when using GitHub Copilot or Azure AI, including Azure OpenAI Service.
- World-class infrastructure and security to grow business and safeguard data.
- Enhanced reliability of generative AI applications by leveraging the speed of Azure Cosmos DB to retrieve and process data.
Vector search in Azure Cosmos DB for MongoDB vCore, now generally available, allows developers to seamlessly integrate their AI-based applications with the data stored in Azure Cosmos DB. Vector search enables users to efficiently store, index and query high-dimensional vector data, eliminating the need to transfer the data to more-expensive alternatives for vector search capabilities, such as vector databases.
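Conceptually, a vector search ranks stored documents by how similar their embedding vectors are to the query's embedding. A minimal brute-force sketch of that ranking follows; the document IDs and embeddings are made up, and a real vCore deployment would use the service's vector index rather than this loop.

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)


def vector_search(query_vec, documents, top_k=2):
    """Rank documents by cosine similarity of their embedding to the query."""
    scored = [(cosine_similarity(query_vec, d["embedding"]), d["id"])
              for d in documents]
    scored.sort(reverse=True)
    return [doc_id for _, doc_id in scored[:top_k]]


# Hypothetical documents with toy 3-dimensional embeddings.
docs = [
    {"id": "doc-a", "embedding": [1.0, 0.0, 0.0]},
    {"id": "doc-b", "embedding": [0.9, 0.1, 0.0]},
    {"id": "doc-c", "embedding": [0.0, 1.0, 0.0]},
]
print(vector_search([1.0, 0.0, 0.0], docs))  # → ['doc-a', 'doc-b']
```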
- Blog: Learn more about dynamic scaling.
- Blog: Learn more about Microsoft Copilot for Azure integration in Azure Cosmos DB.
- Breakout: Data for the era of AI: build intelligent apps with Azure Cosmos DB
- Demo: Optimizing workloads with Azure Cosmos DB’s newest Autoscale features
- Demo: Build AI-powered apps with Azure Cosmos DB for MongoDB vector search
- Discussion: Creating your own ChatGPT experience with enterprise data: Q&A
1.4.8. New manageability and security features for SQL Server enabled by Azure Arc
Enhancements to SQL Server enabled by Azure Arc offer additional management capabilities to SQL Server running outside Azure, including monitoring, high availability/disaster recovery (HA/DR) management and Extended Security Updates.
Monitoring for SQL Server enabled by Azure Arc , now in preview, will allow customers to gain critical insights into their entire SQL Server estate across on-premises datacenter and cloud, optimize for database performance and diagnose problems faster. With this monitoring tool, customers will be empowered to switch from a reactive operation mode to a proactive one, further improving database uptime while reducing routine workloads.
Enhanced high availability and disaster recovery (HA/DR) management for SQL Server enabled by Azure Arc is now in preview. With Azure Arc, customers can now improve SQL Server business continuity and consistency by viewing and managing Always On availability groups, failover cluster instances and backups directly from the Azure portal. This new capability will provide customers with better visibility and a much easier and more flexible way to configure critical database operations.
Extended Security Updates for SQL Server enabled by Azure Arc is now generally available. Extended Security Updates for SQL Server, which provide critical security updates for up to three years after the end of extended support, are now available as a service through Azure Arc. With the Extended Security Update service, customers running older SQL Server versions on-premises or in multicloud environments can manage security patches from the Azure portal. Extended Security Updates enabled by Azure Arc give financial flexibility with a pay-as-you-go subscription model.
- Breakout: Do More with Windows Server and SQL Server on Azure
- Breakout: Design cloud to edge architecture patterns with Azure Arc
- Demo: Bring automated data manageability to SQL Server anywhere
- Discussion: Bring cloud-native automated data manageability to SQL Server: Q&A
1.4.9. New performance enhancements in Azure Database for MySQL Business Critical
New performance enhancements in Azure Database for MySQL Business Critical service tier make it ideal for high-performance transactional or analytical applications. With the preview of Accelerated Logs, organizations may see an out-of-the-box improvement in performance of up to two times or more at no additional cost, based on internal testing by Microsoft.
A recent performance benchmark study by Principled Technologies shows that Azure Database for MySQL Business Critical service tier is up to 50 percent faster than MySQL on Amazon Web Services (AWS) Relational Data Service and up to 2.6 times faster than Google Cloud Platform (GCP) Cloud SQL for MySQL. These enhancements help make Azure Database for MySQL Business Critical ideal for mission-critical, Tier 1 MySQL workloads.
- Demo: Explore new business-critical features of Azure Database for MySQL
1.4.10. Performance enhancements and new AI capabilities for Azure Database for PostgreSQL
Azure Database for PostgreSQL is a database service built on Microsoft’s scalable cloud infrastructure for application developers. Updates to the service include:
Enhanced performance and scalability for Azure Database for PostgreSQL. This update provides advanced storage and compute capabilities that enable optimal price-performance for enterprise production workloads. Features include:
- Premium SSD v2, in preview, will offer sub-millisecond disk latencies plus up to 64 TB of storage and 80K input/output operations per second (IOPS) for demanding IO-intensive workloads at a low cost, providing great flexibility for managing performance and cost for Tier-1 production environments.
- IOPS scaling, in preview, will enable customers to scale IOPS up to 20K to perform transient operations such as migrations or data loads more quickly, and then scale back down when not required, to save cost.
- Online, dynamic compute and storage scaling , now generally available, adjusts the amount of compute and storage resources based on current demand via a seamless experience with near-zero downtime.
Azure Database for PostgreSQL extension for Azure AI will allow developers to leverage large language models (LLMs) and build rich PostgreSQL generative AI applications, meaning PostgreSQL queries on Azure can now power Azure AI applications. Now in preview, the extension will enable:
- Calling into Azure OpenAI Service to generate LLM-based vector embeddings that allow efficient similarity searches, which is particularly powerful for recommendation systems.
- Calling into Azure AI Language for a wide range of scenarios such as sentiment analysis, language detection, entity recognition and more.
- Demo: Best performance and scalability with PostgreSQL in Azure
- Demo: Build AI solutions with Azure Database for PostgreSQL
1.5. Azure Infrastructure
1.5.1. AI infrastructure updates
Azure is the world’s computer, powering a range of solutions from cloud services to running the most sophisticated AI models. With insights from workloads and customer requirements, Microsoft is optimizing and innovating across every layer of the hardware and software stack.
Microsoft’s ecosystem approach includes longstanding partnerships with industry leaders to provide customers with choice in performance, efficiency and cost for AI inferencing, training and general compute.
Azure infrastructure is adding choice in price and performance across the Azure infrastructure technology stack, from the datacenter and its racks to servers and the silicon that powers them, including:
Custom-built silicon for AI and enterprise workloads in the Microsoft Cloud: Today, Microsoft is announcing new custom silicon that complements Microsoft's offerings with industry partners. The two new chips, Microsoft Azure Maia and Microsoft Azure Cobalt, were built with a holistic view of hardware and software systems to optimize performance and price.
Microsoft Azure Maia is an AI Accelerator chip designed to run cloud-based training and inferencing for AI workloads, such as OpenAI models, Bing, GitHub Copilot and ChatGPT.
Microsoft Azure Cobalt is a cloud-native chip based on Arm architecture optimized for performance, power efficiency and cost-effectiveness for general purpose workloads.
Azure Boost is now generally available: One of Microsoft Azure's latest and most significant infrastructure improvements, Azure Boost, is now generally available. Azure Boost enables greater network and storage performance at scale, improves security and reduces servicing impact by moving virtualization processes traditionally performed by the host servers, such as networking, storage and host management, onto purpose-built hardware and software optimized for these processes. This innovation allows Microsoft to achieve the fastest remote and local storage performance in the market today, with remote storage performance of 12.5 Gbps (gigabits per second) throughput and 650K IOPS (input/output operations per second) and local storage performance of 17.3 Gbps throughput and 3.8M IOPS.
ND MI300 v5 virtual machines with AMD chips optimized for generative AI workloads: The ND MI300 v5 virtual machines are designed to accelerate the processing of AI workloads for high-range AI model training and generative inferencing, and will feature AMD's latest GPU, the AMD Instinct MI300X.
NC H100 v5 virtual machines with the latest NVIDIA GPUs: The new NC H100 v5 Virtual Machine (VM) series, in preview, is built on the latest NVL variant of the NVIDIA Hopper 100 (H100), which will offer greater memory per GPU. The new VM series will provide customers with greater performance, reliability and efficiency for mid-range AI training and generative AI inferencing. By maintaining more memory per GPU in the VM, customers increase data processing efficiency and enhance overall workload performance.
- Blog: Learn more on Microsoft Source.
- Blog: Learn more about Azure Boost.
- Breakout: Unlock AI innovation with Azure AI Infrastructure
1.5.2. Azure Monitor and Azure Migrate updates
Azure Monitor System Center Operations Manager (SCOM) Managed Instance brings SCOM monitoring capabilities and configurable health models to Azure Monitor. A capability within Azure Monitor, SCOM Managed Instance provides a cloud-based alternative for SCOM customers, providing monitoring continuity for cloud and on-premises environments across the cloud adoption journey.
SCOM Managed Instance is now generally available. Since preview, SCOM Managed Instance has added multiple capabilities such as the integration of SCOM alerts with that of Azure Monitor, the ability to send integrated alerts to IT service management tools, the ability to view service health from the Azure portal and an enhanced onboarding experience.
Azure Migrate, the service used to migrate to and modernize in Azure, is introducing discovery, business case analysis and assessment support for new workloads. This allows customers to analyze their configuration and compatibility for new use cases so they can determine appropriately sized Azure instances at optimal cost and without blockers.
Specific features in preview include Spring apps assessment, business case analysis with management costs, business case analysis and assessment with security, and Windows and SQL ESU in business case analysis; Web apps assessment is generally available.
- Blog: Learn more about updates to Azure Monitor System Center Operations Manager (SCOM) Managed Instance.
- Blog: Learn more about new Azure Migrate updates.
- Breakout: Migrate to Innovate: Be AI-ready, secure and optimize operations
- Demo: Next-gen monitoring with Azure Monitor
1.5.3. Introducing Azure IoT Operations
Azure IoT Operations is a new addition to the Azure IoT portfolio that will offer a unified, end-to-end Microsoft solution that digitally transforms physical operations seamlessly from the cloud to the edge.
This offering, now in preview, will feature a “One Microsoft” approach from cloud to edge to digitally transform physical operations. Microsoft is standardizing cloud-to-edge architecture for digital solutions in physical operations with industry standards and open-source approaches.
This unifying approach for customers’ digital ecosystems will remove technical hurdles for the next level of digital transformation, enable technical collaboration across IT and operational technology and bring interoperability and scalability to digital solutions.
That unified approach consists of the following:
- Management plane: One control plane to secure and govern assets and workloads across cloud to edge with Azure Arc.
- Application development: Consistently build and deploy apps anywhere, in the cloud or at the edge.
- Cloud-to-edge data plane: Seamless integration at the data level from asset to cloud and back again.
- Common infrastructure: Customers can connect investments in the cloud with their on-premises resources.
- Breakout: The AI-era of Industrial Transformation
- Discussion: Evolving IoT for physical operations
1.5.4. Microsoft and Oracle announce general availability of Oracle Database@Azure
Microsoft and Oracle announce the general availability of Oracle Database@Azure, which will become available in the US East Azure region starting in December 2023, with expansions planned in additional regions in the first quarter of 2024 and beyond. Customers will have direct access to Oracle database services running on Oracle Cloud Infrastructure (OCI) deployed in Microsoft Azure datacenters, starting with the Oracle Exadata Database Service, combined with the security, flexibility and best-in-class services of Microsoft Azure. Microsoft is the only other hyperscaler to offer OCI Database Services to simplify cloud migration, multicloud deployment and management.
- Microsite: Learn more about this update.
- Demo: Oracle Database@Azure Demo
- Discussion: Multicloud with OCI and Azure – Oracle Database@Azure
- Breakout: Migrate Oracle workloads to Azure
1.5.5. Updates across Azure Arc
Azure Arc simplifies governance and management by delivering a consistent multicloud and on-premises management platform to help organizations control and govern their environments. Updates to Azure Arc features and infrastructure capabilities include:
VMware vSphere enabled by Azure Arc, now generally available, helps users simplify management of a hybrid IT estate distributed across VMware vSphere and Azure. It does so by extending the Azure control plane to VMware vSphere infrastructure, enabling the use of Azure security, governance and management capabilities consistently across VMware vSphere and Azure.
Customers can start by connecting Azure Arc to the resources in their VMware vSphere deployments, installing agents at scale and enabling Azure management, observability and security solutions, while benefitting from existing lifecycle management capabilities.
The latest Azure Stack HCI feature update, in preview, will bring innovative capabilities that continue to simplify the day-to-day life of IT pros by making the deployment and management of Azure Stack HCI simpler and more automated when coupled with the newly released turnkey solutions. For example, a new cluster deployment capability will automatically provision virtual machines (VMs), and Azure Stack HCI updates can be managed from the Azure portal at scale. Additionally, this feature update expands VM extensions support to include Microsoft Defender for Cloud, Azure Monitor and Azure Update Manager.
Other updates, in preview, include:
- Site manager, a new feature as part of the Azure Stack HCI update that will help customers organize all Arc resources per location, which saves time.
- A new tool that will leverage Azure Migrate to enable customers to migrate their Hyper-V virtual machines directly to Azure Stack HCI nodes. This simplifies customer migration to a new infrastructure, which is especially useful during their hardware refresh cycle.
In addition to Azure Stack HCI news, AKS on VMware, in preview, will give customers who have used Azure Kubernetes Service (AKS) in the cloud or AKS on-premises on Windows Server or Azure Stack HCI the same experience for their VMware environment. With this new member of the AKS family, Microsoft will have a holistic suite of Kubernetes offerings for customers in the cloud and on-premises.
System Center Virtual Machine Manager (SCVMM) self-service capabilities are now generally available in Azure with Azure Arc. Once connected with Azure Arc, customers can manage and control their System Center Virtual Machine Manager (VMM) environments on Azure and perform VM self-service operations from the Azure portal. Customers get a consistent management experience across Azure for the cloud and hybrid environments. For Azure Pack customers, this solution is intended as an alternative for performing VM self-service operations.
- Blog: Bring Azure to your VMware environment: Announcing GA of VMware vSphere enabled by Azure Arc
- Blog: Accelerate edge deployments with cloud-managed infrastructure and Azure Stack HCI version 23H2
- Learn: What’s new in System Center Virtual Machine Manager
1.5.6. Updates to Azure Storage services
Several new features and performance enhancements for storage on Microsoft Azure are designed to simplify data management, enhance performance and facilitate a smoother migration to the cloud. These updates include:
Azure Ultra Disk Storage: The maximum provisioned input/output operations (IOPS) per second and provisioned throughput on Azure Ultra Disk Storage is increased to 400,000 IOPS and 10,000 MB/s per disk. A single Ultra Disk can achieve the maximum IOPS and throughput of the largest Azure virtual machines, reducing the complexity of managing multiple disks striped together. The increased performance can also be leveraged by multiple Azure Virtual Machines when the Ultra Disk is configured as a shared disk. This update is generally available.
Azure Storage Mover: Azure Storage Mover is a Microsoft service that enables Azure storage customers to migrate their on-premises file shares to Azure file shares and Azure Blob Storage. Updates include:
- The Server Message Block (SMB) share to Azure file share migration path is now generally available.
- A Storage Mover agent image for VMware is now generally available.
- Storage accounts with the blob storage Hierarchical Namespace Service feature are now supported in preview.
New file system performance and economics with Azure Native Qumulo Scalable File Service: Azure Native Qumulo (ANQ) V2 Scalable File Service transforms the cloud file service by combining the performance and elasticity of the cloud with enterprise features and universal compatibility of on-premises file systems.
New and enhanced capabilities, generally available, include:
- Unbeatable economics: ANQ offers pay-as-you-go pricing, enabling reductions in cost as data volume grows.
- Cloud scale: ANQ separates performance from capacity, allowing throughput and capacity to scale independently, elastically and seamlessly.
- Cloud simplicity: ANQ's rapid deployment gets customers running in just 12 minutes directly in the Azure portal. With ANQ V2's global namespaces, all workloads can be pointed to a single namespace regardless of whether the data resides on ANQ or in on-premises Qumulo environments.
- Blog: Learn more about Azure Native Qumulo Scalable File Service.
- Blog: Learn more about Azure Storage updates .
- Product roundtable: Streamline storage data management with Azure Storage Actions
- Discussion: Ask our migration experts about migrating data to Azure Storage
1.6. Azure Management & Operations
1.6.1. Azure Business Continuity Center helps manage, protect and govern resources at scale, now in preview
Azure Business Continuity Center is a tool that will give IT admins comprehensive, resilient protection capabilities to address complex security requirements and rapidly evolving threats. The new Azure Business Continuity Center, now in preview, will provide IT admins with these benefits:
- Simplified management: A one-stop solution for ensuring business continuity and disaster recovery (BCDR) by providing customers with the ability to manage solutions across first-party Azure services, such as Azure Backup and Azure Site Recovery, and eventually third-party solutions.
- Rich insights: Users can view a security summary of the entire BCDR estate and receive guidance on individual security properties with actionable insights to improve security posture.
- Protection across the continuum: Helps ensure efficient ransomware protection and mitigation for the infrastructure, data and application layers with simplified monitoring to validate current protection state and configuration drift.
1.6.2. Azure Chaos Studio now generally available
Azure Chaos Studio, now generally available, provides a fully managed experimentation platform for discovering challenging issues through experiment templates, dynamic targets and a more guided user interface.
Chaos Studio offers customers the opportunity to intentionally disrupt their applications to uncover reliability issues and strategize for issue prevention before they impact users. For example, Chaos Studio enables users to assess how applications respond to real-world disruptions like network delays, unexpected storage failures, expired secrets or even complete datacenter outages. Templates in Chaos Studio allow customers, in a matter of minutes, to test the resilience of their Azure resources by providing a set of pre-filled experiments based on common outage scenarios. Dynamic targets functionality allows users to select experiment targets by using Keyword Query Language (KQL) queries, rather than static list selection, allowing more sophisticated fault injection scenarios to be created.
Customers can boost their resilience against faults and failures by gaining a better understanding of application resiliency, conducting experiments with a wide variety of agent- and service-based faults and maintaining production quality through continuous validation.
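The dynamic-targets idea (selecting experiment targets by query rather than a static list) can be sketched as a filter over resource metadata. The KQL-style predicate is simulated here with a plain Python condition, and the resource records are invented for illustration; a real experiment would run the KQL query against the Azure resource inventory.

```python
# Hypothetical inventory of Azure resources an experiment could target.
resources = [
    {"name": "vm-web-01", "type": "virtualMachine", "region": "eastus", "tag": "prod"},
    {"name": "vm-web-02", "type": "virtualMachine", "region": "westus", "tag": "prod"},
    {"name": "vm-dev-01", "type": "virtualMachine", "region": "eastus", "tag": "dev"},
]


def select_targets(resources, predicate):
    """Dynamically select fault-injection targets, KQL-query style,
    instead of maintaining a static target list."""
    return [r["name"] for r in resources if predicate(r)]


# "All production VMs in eastus" expressed as a dynamic target query.
targets = select_targets(
    resources,
    lambda r: r["tag"] == "prod" and r["region"] == "eastus",
)
print(targets)  # → ['vm-web-01']
```

As the inventory changes, re-running the query picks up new matching resources automatically, which is the advantage over a static list.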
- Blog: Announcing General Availability: Microsoft Azure Chaos Studio
- Discussion: Optimize app reliability with automated Load and Chaos testing, Q&A
2. Developer
2.1. Developer Community
2.1.1. New AI Microsoft Applied Skills credentials now available
Microsoft is releasing new Microsoft Applied Skills credentials critical to AI transformation to help users:
- Develop generative AI with Azure OpenAI Service.
- Create an intelligent document processing solution with Azure AI Document Intelligence.
- Build a natural language processing solution with Azure AI Language.
- Build an Azure AI Vision solution.
Microsoft previously announced Microsoft Applied Skills, a new credential that allows individuals to prove they have specific skills needed to implement projects aligned to business scenarios.
Including the new set available at Microsoft Ignite, 15 Microsoft Applied Skills credentials have been released, with at least five more coming by the end of December 2023. The new credentials align to projects, like developing generative AI solutions or configuring secure access using Azure networking, which are key as organizations adopt cloud and AI technologies.
Microsoft Certifications and Microsoft Applied Skills are complementary and verified by Microsoft, offering a signal of trust to organizations and helping them efficiently pinpoint talent with the technical skills they need to implement highly technical solutions and take on projects critical to organizational success.
The process to earn a Microsoft Applied Skills credential is designed to be straightforward and flexible and includes:
- Optional training, including free self-paced learning paths available on Microsoft Learn, with instructor-led training coming soon.
- Passing an interactive, lab-based assessment that takes candidates through a series of scenario-based tasks in products like Microsoft Azure or Power Platform. The lab assessment is accessible on-demand directly from Microsoft Learn.
- Receiving a credential, verified by Microsoft, which can be easily shared to LinkedIn profiles, providing the opportunity to showcase new skills on a professional network.
- Discussion: Boost your profile with the new Microsoft Applied Skills credential
- Demo: Navigating Microsoft Credentials
2.2. Developer Tools & DevOps
2.2.1. Azure Migrate application and code assessment now generally available
The Azure Migrate application and code assessment, now generally available, complements the Azure Migrate assessment and migration tool to help modernize and re-platform large-scale .NET and Java applications through detailed code and application scanning and dependency detection. The tool offers a comprehensive report with recommended code changes for customers to apply a broad range of code transformations with different use cases and code patterns.
- Breakout: Unlock innovation with AI by migrating enterprise apps to App Service
- Discussion: Easy app migration with production grade landing zones & app patterns
- Demo: Move to the cloud faster with Azure Migrate app and code assessment
2.2.2. Azure Container Apps makes it easier to deploy apps, run AI workloads
Azure Container Apps is adding new features to make it easier to deploy code to the cloud and run AI workloads.
- Dedicated GPU workload profiles: Users will be able to run machine learning models with Azure Container Apps as a target compute platform to build event driven intelligent applications to train models or derive data-driven insights. This feature is in preview.
- Azure Container Apps landing zone accelerator: Simplifies building of a production-grade secured infrastructure at an enterprise scale to deploy fully managed, cloud-native apps and microservices. This feature is generally available.
- Azure Container Apps code to cloud: Users will be able to focus on code and quickly take an application from source to cloud without the need to understand containers or how to package application code for deployment. This feature is in preview.
- Vector database add-ins: Three of the most popular open-source vector database variants, Qdrant, Milvus and Weaviate, are now available in preview as add-ins for developers to get started in a fast and affordable way.
- Blog: Build Intelligent Apps and Microservices with Azure Container Apps
- Breakout: AI and Kubernetes: A winning combination for Modern App Development
- Discussion: All about Intelligent apps: What are they and how to build them Q&A
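The vector database add-ins above (Qdrant, Milvus and Weaviate) all solve the same core problem: nearest-neighbor search over embedding vectors. The following pure-Python sketch shows that search step conceptually; the real add-ins use approximate indexes for scale and are consumed through their own client libraries, so this is an illustration, not their API.

```python
# Conceptual sketch of what a vector database does: rank stored embeddings
# by cosine similarity to a query vector and return the closest matches.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def search(store, query_vec, top_k=2):
    """store: list of (id, vector) pairs; returns ids ranked by similarity."""
    ranked = sorted(store, key=lambda item: cosine(item[1], query_vec), reverse=True)
    return [item_id for item_id, _ in ranked[:top_k]]

# Toy 2-dimensional "embeddings"; production embeddings have hundreds of dims.
docs = [("doc-a", [1.0, 0.0]), ("doc-b", [0.0, 1.0]), ("doc-c", [0.7, 0.7])]
print(search(docs, [1.0, 0.1]))  # -> ['doc-a', 'doc-c']
```

A brute-force scan like this is O(n) per query; the add-ins exist precisely to replace it with indexed approximate search so intelligent apps can query millions of vectors quickly.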
2.2.3. Azure Deployment Environments adds new capabilities
Azure Deployment Environments is adding new capabilities to increase Azure service integration and expand configuration options. These features, now in preview, will make it easier to set up application infrastructure with the Azure Developer CLI tool and to delete it with the schedule auto-expiry feature:
- Azure Developer CLI (azd) integration: Enterprise developers will be able to leverage azd to provision application infrastructure using Azure Deployment Environments and more easily deploy code onto their provisioned infrastructure.
- Schedule auto-expiry: Development teams will be able to configure environments to auto-expire so that the resources are deleted once an environment is no longer needed.
- Breakout: Develop in the cloud with Dev Box and Azure Deployment Environments
2.2.4. Azure Functions announces support for .NET 8, new hosting plan
Azure Functions now supports .NET 8 for applications using the isolated worker model. Support is now available for Windows and Linux on the consumption, elastic premium and application service plan hosting options. This update is generally available.
Flex Consumption Plan is a new Azure Functions hosting plan that will build on the consumption, pay-for-what’s-used, serverless billing model. It will provide more flexibility and customizability without compromising on available features. New capabilities will include fast and large elastic scale, instance size selection, private networking, availability zones and high concurrency control. Users can request access to the private preview.
2.2.5. Azure Kubernetes Service offers new capabilities for AI and machine learning workloads
Customers can now run specialized machine learning workloads like large language models (LLMs) on Azure Kubernetes Service (AKS) more cost effectively and with less manual configuration. The release of Kubernetes AI toolchain operator automates LLM model deployment on AKS across available CPU and GPU resources by selecting optimally sized infrastructure for the model. It makes it possible to easily split inferencing across multiple lower-GPU-count VMs, increasing the number of Azure regions where workloads can run, eliminating wait times for higher-GPU-count VMs and lowering overall cost. Customers can also choose from preset models with images hosted by AKS, significantly reducing overall inference service setup time.
Additionally, Azure Kubernetes Fleet Manager enables multi-cluster and at-scale scenarios for AKS clusters. Platform admins who are managing Kubernetes fleets with many clusters often face challenges staging their updates in a safe and predictable way. Fleet Manager allows admins to orchestrate updates across multiple clusters by using update runs, stages and groups. This capability is now generally available.
- Breakout: Master Platform Engineering: Architecting Scalable & Resilient Systems
- Demo: Dev-centric automation, AI-Assisted Ops and Cost Optimization in AKS
2.2.6. Azure Native Services update to optimize performance and scale
Azure Native ISV Services enables organizations to access and utilize specialized software and services on Microsoft Azure. These services offer features tailored for cloud performance, seamless integration and operational efficiency between independent software vendor (ISV) software and services native on Azure.
Apache Airflow™ on Astro – an Azure Native ISV Service now in preview – will enable organizations to place Airflow at the core of their data operations, providing ease of use, scalability and enterprise-grade security to help ensure the reliable delivery of mission-critical data pipelines. With the Azure Native ISV Services integration, Astro will be easily available within the Azure portal as a managed service. Instead of managing complex data pipelines, developers will be able to focus on data, code, security and billing across third-party entities. Developers may opt for the pay-as-you-go option based on their usage with billing via the Azure Marketplace.
2.2.7. Microsoft Dev Box introducing new capabilities for customization and setup
Microsoft Dev Box is introducing new capabilities to support greater customization and simplify Dev Box setup for development teams. These new features, now in preview, focus on increasing developer productivity and satisfaction:
- Dev box limits: Developer teams will be able to directly limit the number of dev boxes each developer can create within a project to help manage costs and ensure efficient use of resources.
Additionally, Dev Boxes will connect to new Microsoft hosted networks to simplify network setup by eliminating the need to create, configure and secure a virtual network. IT admins and development teams can also use the new quick create template for a complete, step-by-step guide from initial admin configuration through Dev Box deployment.
Docker, in collaboration with Microsoft, now provides Dev Box-compatible preconfigured images on the Azure Marketplace that have everything needed to build containers with Visual Studio and Docker Desktop.
- Demo: Use customization to personalize Dev Box for you and your team
- Discussion: The ins and outs of deploying Microsoft Dev Box inside Microsoft, Q&A
2.2.8. Microsoft offering guidance to help organizations establish Platform engineering
Platform engineering is an approach that builds on DevOps best practices through tools for automation, tracking, governance and observability to accelerate modern software application delivery.
Organizations need developers to get started and deploy code quickly while also ensuring their software processes are secure, compliant and cost-controlled. Platform engineering capabilities empower developers with more self-service, automated experiences, and operations teams with more standardized, secure and efficient application development infrastructure to ultimately improve developer productivity, governance and time to business value.
Microsoft provides a core set of technology building blocks and learning modules to help organizations get started on their journey to establish a platform engineering practice.
- Breakout: Build a productive and secure AI-powered experience for dev teams
- Discussion: Platform engineering Q&A with the Microsoft platform engineering team
2.2.9. MQTT broker feature, publish-subscribe capabilities now available for Azure Event Grid
Azure Event Grid now supports additional capabilities to help customers capitalize on growing industry scenarios. A key part of this new functionality is the ability to deliver publish-subscribe messaging at scale, which enables flexible consumption patterns for data over HTTP and MQTT protocols. This capability is now generally available.
Reflecting the growing demand for connectivity, integration and analytics between Internet of Things (IoT) devices and cloud-based services, Azure Event Grid’s new MQTT broker feature enables bi-directional communication between MQTT clients at scale, enabling one-to-many, many-to-one and one-to-one messaging patterns using MQTT v3.1.1 and MQTT v5 protocols. This feature is now generally available.
These capabilities allow IoT devices from manufacturing plants, automobiles, retail stores and more to send data to – and receive data from – Azure services and third-party services. To process the data further, users can route IoT data to Azure services, such as Azure Event Hubs, Azure Functions and Azure Logic Apps. Data can also be routed to third-party services via webhooks.
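As a sketch of the device side of this flow, the function below publishes telemetry to an MQTT broker endpoint with the open-source paho-mqtt client. The hostname pattern, port and certificate-based authentication are assumptions for illustration; check the Event Grid namespace documentation for your actual MQTT hostname and auth setup.

```python
# Hedged sketch: publishing device telemetry over MQTT with paho-mqtt.
# Event Grid's MQTT broker speaks standard MQTT v3.1.1/v5 over TLS, so a
# stock client library works; hostnames and auth below are illustrative.

def telemetry_topic(device_id: str) -> str:
    # One-to-many fan-out: every subscriber to devices/+/telemetry receives this.
    return f"devices/{device_id}/telemetry"

def publish_telemetry(host: str, device_id: str, payload: bytes,
                      certfile: str, keyfile: str) -> None:
    import paho.mqtt.client as mqtt  # requires the paho-mqtt package

    # paho-mqtt 1.x style constructor; paho-mqtt 2.x additionally requires a
    # CallbackAPIVersion as the first argument.
    client = mqtt.Client(client_id=device_id, protocol=mqtt.MQTTv5)
    client.tls_set(certfile=certfile, keyfile=keyfile)  # X.509 client auth
    client.connect(host, 8883)  # MQTT over TLS
    client.publish(telemetry_topic(device_id), payload, qos=1)
    client.disconnect()

# Hypothetical usage (namespace hostname and cert paths are placeholders):
# publish_telemetry("contoso-ns.eastus-1.ts.eventgrid.azure.net",
#                   "sensor-01", b'{"temp": 21.5}', "client.pem", "client.key")
```

Because the broker is protocol-standard, the same client code can target the one-to-many, many-to-one and one-to-one patterns described above purely through topic design.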
Other new features include:
- Pull delivery for event-driven architectures: This allows customers to process events from highly secure environments without configuring a public end point, controlling the rate and volume of messages consumed, while supporting much larger throughput. This feature is generally available.
- Push delivery to Azure Event Hubs: Event Grid namespaces will support the ability to push events to Azure Event Hubs at high scale through a namespace topic subscription. This enables the development of more distributed applications to send discrete events to ingestion pipelines. This feature is in preview.
- Increased throughput units: To help customers scale to meet the demands of these new scenarios, Event Grid has also increased the number of throughput units available in an Event Grid namespace to 40, meeting the needs of more data-intensive scenarios by providing more capacity. This feature is generally available.
2.2.10. New AI and orchestration capabilities from Azure Communication Services
Azure AI Speech integration into Azure Communication Services Call Automation workflows, generally available in November, will enable AI-assisted experiences for customers calling into a business. By recognizing specific phrases as well as free-form sentences spoken by a customer, businesses will be able to adapt to changing customer needs and help shorten the time customers spend navigating Interactive Voice Response (IVR) menus.
Azure Communication Services job router, generally available in early December, will simplify the development of routing capabilities for inbound customer communications and steering customers to the most suitable point of contact in a business. Whether it’s an agent in a contact center with a specific skill set or an automated service designed to manage routine inquiries, job router will ensure that every customer inquiry is pointed to the most appropriate resource available.
- Pre-recorded: Transform customer experiences with AI-assisted voice, video and chat
2.2.11. New capabilities simplify app migration to Azure App Service for Linux, Windows
Azure App Service is quickly becoming the preferred cloud destination for migrating millions of .NET and Java workloads still running on-premises. New capabilities that simplify app migration for Linux and Windows include:
- Single subnet support for multiple App Service plans is now generally available. Network administrators gain substantial reduction in management overhead thanks to the new capability enabling multiple service plans to connect to a single subnet in a customer’s virtual network.
- WebJobs on Linux is now in preview. WebJobs is a popular feature of Azure App Service that enables users to run background tasks in the Azure App Service without any additional cost. Previously available on Windows, it will extend to Linux, enabling customers to run background or recurring tasks and do things like send email reports or perform image or file processing.
- Extensibility support on Linux is now in preview. Previously available on Windows, it will allow Linux web apps to take advantage of third-party software services on Azure and connect to Azure Native ISV services more easily.
- gRPC support on App Service for Linux is now generally available. gRPC, a high-performance, open-source universal RPC framework, provides full bi-directional streaming and increased messaging performance over HTTP/2 for web apps.
2.2.12. New updates for integration of applications, data and processes in Azure
Several new updates have been made throughout Azure offerings, giving users the ability to better integrate their applications, data and processes.
API Management’s Credential Manager, now generally available, simplifies the management of authentication and authorization for both professional developers and citizen developers.
Defender for APIs, a new offering as part of Microsoft Defender for Cloud, a cloud-native application protection platform (CNAPP), is now generally available. Natively integrated with Azure API Management, it gives security admins visibility into business-critical Azure APIs, helping them understand and improve their security posture, prioritize vulnerability fixes, and detect and respond to active runtime threats within minutes using machine learning-powered detection of anomalous and suspicious API usage.
Azure Integration Environment, in preview, is a new Azure service that will offer a unified and streamlined experience for Azure Integration Services, presenting users with a single-pane view of the various components of Azure Integration Services used to build an integration solution. With a unified interface, it will empower users to effortlessly orchestrate and manage diverse components of the Azure Integration Services portfolio tailored to their unique needs.
Other updates include:
- .NET Framework Custom Code Extensibility for Azure Logic Apps is generally available.
- Application Insights enhancements for Azure Logic Apps are generally available.
- Business process tracking is in preview.
- Discussion: Accelerate innovation with Azure Integration Services Q&A
- Demo: Effective API Management: A Deep Dive on OpenAI + Azure API Management
- Demo: Enhance API Data security with Defender for APIs
3. Edge
3.1. Edge
3.1.1. Shared links in Edge will provide an inbox for links
Shared links in Microsoft Edge for Business will bring links that have been shared in Microsoft Outlook and Teams to helpful locations, such as the address bar, new tab page and the Edge sidebar. The experience will also be available in the Edge mobile app. Instead of having to hunt through emails and chats or ask a colleague for a link again, users will be able to easily find and open these links within the Edge browser.
Users will easily see recent links shared with them from Outlook and Teams by typing in the address bar or opening a new tab. They can also open a new Shared links pane in the Edge sidebar or in the Edge mobile app to see even more links. From there, they can filter by person, date, link type or the app it came from. Edge will be like an inbox to find and open shared links – providing a quick reference home for links that colleagues have shared.
Shared links in Microsoft Edge for Business is available as prepopulated results in the Edge address bar, is in preview in the Edge new tab page and in Edge mobile, and will be coming soon to the sidebar.
4. Microsoft 365
4.1. Microsoft 365 Apps & Services
4.1.1. Introducing Microsoft SharePoint Premium
As the volume of content grows across every organization in this era of digital transformation and AI, Microsoft is expanding its content management portfolio beyond SharePoint, the market leader in content platforms.
SharePoint Premium, Microsoft’s new AI-powered solution to transform content management and content experiences and to get content ready for Microsoft Copilot (a set of tools that help people achieve more using AI), is now in preview and will be generally available early next year. SharePoint Premium will build structure, security and governance to ground an organization’s content, so Copilot has better information to leverage. New features will include:
- Content experiences to help information workers in their flow of work, allowing them to seamlessly discover, interact and collaborate with hundreds of file types, while providing fresh content using AI analytics with branded document packaging.
- Content solutions that optimize critical business processes with AI, security and automated workflows.
SharePoint Premium also includes content processing and content governance services and solutions, with both new and existing capabilities from Microsoft Syntex and Microsoft SharePoint Advanced Management.
SharePoint Premium will expand content management in Microsoft 365 to help organizations get more value from their content throughout the lifecycle and bring content into the flow of work for information workers, IT pros, developers and more. Availability of services included in SharePoint Premium will roll out between now and the first half of 2024.
- Blog: Introducing SharePoint Premium – the future of AI powered content management and experiences
- Breakout: Transform your cloud content experiences: Introducing Microsoft SharePoint Premium
4.1.2. Microsoft Clipchamp and app integrations for Microsoft Designer now available
Microsoft 365 is enabling a new era of visual content creation for commercial customers. With Microsoft Clipchamp and Microsoft Designer , Microsoft 365 is democratizing video and image creation. Intuitive user experiences enhanced by AI make it simple for anyone to create compelling visuals – no experience required.
Microsoft Clipchamp: Clipchamp is now generally available for commercial customers and can be accessed by users licensed for Microsoft 365 Enterprise (E3 and E5) and Business (Standard and Premium) suites.
Additionally, Clipchamp Premium features and licensing will become available in December 2023. These features will include 4K exports, organization brand kit capabilities and premium stock content (audio, video and graphics), with more premium features being added in the coming months, including AI-powered features. Customers will also be able to purchase Clipchamp independently of Microsoft 365 as a standalone license.
Microsoft Designer: Designer is first coming to commercial customers through integrations in various Microsoft 365 apps, so users can create images within the flow of work. Microsoft 365 Enterprise (E3 and E5) and Business (Standard and Premium) users can now use Designer in Edge. Designer in Teams for creating announcement banners will become available in the coming weeks, and Copilot in Microsoft Designer for use in Word, PowerPoint and Whiteboard will be available by the end of 2023. A preview of the Designer app will become available to commercial customers in the coming months, followed by general availability later next year.
- Pre-recorded: Visual Content Creation for Everyone: Clipchamp and Designer
- Product roundtable: Microsoft Stream and Clipchamp
4.1.3. Microsoft Loop now generally available with updated features
The Microsoft Loop app is now generally available for the web and mobile (iOS and Android) for commercial customers. The Loop mobile app (iOS and Android) is generally available for consumer customers as well. Full functionality is available to business customers as part of their Microsoft 365 E3, E5, Business Standard and Business Premium licenses.
Recent improvements and new capabilities for the Loop app include:
- Workspace status: Currently in private preview, Workspace status will provide information such as upcoming deadlines and status changes to provide a simple overview and help people know where to direct their attention.
- Workspace descriptions: When a workspace is created, Loop will intelligently surface files and documents that may be related to the project. A more descriptive prompt can be included with the workspace title to return more refined results and kick off a workspace with everything that’s needed. This capability is in preview.
- Power Automate: Loop will integrate Power Automate to help simplify task tracking and project management. A new rule can be created in a Loop table enabling automatic notifications to be sent when the table is updated. This automation will help teams stay on track while saving time to focus on what matters. This capability is in preview.
- Start a workspace from Teams: After a Teams meeting, a Loop workspace will be able to be created and will be automatically populated with the related documents and notes from the meeting, helping a team get started on their project easily. This capability will be in preview by the end of 2023.
- See also: 5.8.1. New capabilities in Microsoft Copilot for Microsoft 365
- Demo: More effective meetings and collaboration with Loop + Teams
- Pre-recorded: Microsoft Loop: Transforming the way we work together
- Product roundtable: Microsoft Loop: help transform the future of co-creation
- Product roundtable: Microsoft Loop integrations with apps outside of Microsoft 365
- Product roundtable: Microsoft Loop: compliance and admin controls
4.1.4. New updates to Microsoft 365 for frontline workers
Microsoft 365 for frontline workers helps support employee experiences in communications and operational efficiency. New updates to the tool include:
Copilot (formerly Bing Chat Enterprise) with commercial data protection will be available to all Microsoft Entra ID users starting with F3 licenses in December. This new tool will bring AI-powered web searches, answers and content generation for frontline workers. And with commercial data protection, frontline workers will be able to make smart requests, like looking up inventory or summarizing large internal documents, without worrying they are sharing company data.
Shifts plugin for Microsoft Copilot for Microsoft 365: Frontline managers will be able to quickly get a list of important items, specific to their team and location, to speed up time-consuming tasks such as covering shifts and onboarding new employees. Shifts plugin for Copilot for Microsoft 365 will use prompts to retrieve insights for frontline managers, leveraging data from the Shifts app, in addition to user and company data it has access to, like Teams chat history, SharePoint, emails and more. This feature will be generally available in December 2023.
Deploy and manage frontline teams with dynamic membership capabilities: Generally available in December, admins will be able to better deploy their frontline by provisioning frontline worker accounts and their teams with group membership in the Teams admin center. Using dynamic groups from Microsoft Entra ID, admins will automatically be kept up to date as people enter, move within or leave the organization. This dynamic membership makes it easy to set up a consistent Teams channel structure to optimize strong frontline collaboration from day one.
Deploy Shifts at scale: A new deployment tool to deploy and manage shifts for frontline workforce across multiple locations in the Teams admin center will be in preview in December. Users will be able to standardize time-off reasons, schedule groups and shift settings across all frontline teams.
Simple authorization with domain-less sign-in: Frontline workers will be able to sign in to Teams faster using only their employee ID, without having to type long domain names. This will be simple for frontline workers to use and easy to manage and deploy to reduce distribution and management overhead for IT admins. This feature will be available in preview in early 2024.
- Breakout: Reimagine the frontline with next-generation AI and fast deployment
- Breakout: Transforming service organizations with generative AI
4.1.5. The new Microsoft Planner brings together to-dos, tasks, plans and projects
Microsoft is bringing together Microsoft To Do, Microsoft Planner and Microsoft Project for the web into a single, unified experience called Microsoft Planner. This experience will first be available in the Planner app in Microsoft Teams in spring 2024, followed by web experiences later in 2024. Powerful, collaborative, scalable and assisted by next-generation AI, the new Planner will help everyone effectively manage their work and achieve their goals.
The new Planner experience will scale easily from collaborative task management to robust project management, empowering everyone, from information workers to frontline workers to project managers, so they can manage their work in one place and accelerate business outcomes with AI-enabled capabilities. Benefits will include:
- The ability to easily find tasks, so users can focus on the tasks they need to get done today.
- Helping users work the way they want. Whether for individual tasks or team initiatives, Planner will enable them to choose the approach that’s best for their needs.
- Creating project plans with powerful scheduling and resourcing tools. As plans evolve, Planner will offer a menu of capabilities to meet unique business needs.
- Copilot in Planner will help users get started with a plan faster with simple prompts and as the plan evolves, it will add goals and intelligently suggest new tasks to help keep users informed on progress.
In addition, the Tasks by Planner and To Do app in Microsoft Teams is being renamed Planner. Microsoft Project for the web will also be renamed over the coming months. Users of Project for the web can continue to use and enjoy the features they know, under the new name Planner.
The new Planner app in Microsoft Teams with premium work management capabilities and AI-powered experiences will be generally available in spring 2024, with the web experience coming later.
- Discussion: The future of AI-enabled work management with Microsoft Planner
- Demo: Using Copilot in Microsoft Planner for simple and advanced projects
- Pre-recorded: The new Microsoft Planner: Bring together to-dos, plans and projects
4.2. Microsoft Teams
4.2.1. Immersive spaces in Teams generally available in January
Immersive spaces in Microsoft Teams, currently in preview, will be generally available in January 2024. Immersive spaces in Teams will bring the power of Mesh into the place where people work every day – Microsoft Teams. From the View menu in a Teams meeting, users can easily transform a 2D meeting into an immersive 3D experience.
Key capabilities in immersive spaces in Teams will include:
- Avatars: Choose an avatar previously built for standard 2D Teams meetings or create a new one. Avatars are easy to customize to reflect a person’s appearance, style or mood for the day.
- 3D environments: Choose from one of the ready-made 3D environments based on meeting needs, whether it’s a big team, social gathering or a small round-table discussion.
- Seat assignments: Select where to sit in a meeting or event to drive connections with others.
- Spatial audio and audio zones: Have multiple, simultaneous conversations and communicate effectively in subgroups without talking over each other.
- Interactive activities: Play built-in interactive games within immersive spaces. Designated areas include spaces to roast marshmallows, throw beanbags, answer icebreaker questions and more.
- Live reactions: Use live reactions, such as hearts, thumbs up or clap, during discussions.
- Blog: Learn more about what’s new in Microsoft Teams.
- Breakout: Get ready for the future of work with Microsoft Teams
4.2.2. Microsoft Mesh generally available in January
Microsoft Mesh, currently in preview, will be generally available in January 2024. With Mesh, users will be able to create custom immersive spaces tailored to specific business needs, such as employee events, trainings, guided tours and internal product showcases. Using a no-code editor, customers will be able to easily customize an immersive event, or use the Mesh toolkit to leverage the power of Unity for a fully customizable experience.
With the Mesh editor, users will be able to customize immersive experiences to address the unique needs of the event without writing a line of code. Event creators will be able to select from a set of ready-to-use immersive spaces, customize them by adding images, videos and screen share in a shared 3D canvas, and have them show up in an event in an orchestrated way. Once these objects are added, change the size and position, or put the video on loop so it fits right into the event. These customizations can then be saved as a template for anyone in the organization to reuse.
At general availability, additional capabilities will make it easier for speakers to interact with attendees when hosting immersive events in Mesh. Event organizers will be able to facilitate a Q&A session by enabling attendees to raise hands. Organizers will see the list of hands raised, in order, and will be able to call on participants, engaging them directly. When called on, attendees will be effectively seen and heard by everyone in the event. This makes immersive events more effective and engaging, and brings elements of real-life town hall experiences where organizers can facilitate a Q&A during an event.
The ability to customize immersive spaces in Mesh will be available in Teams Premium.
4.2.3. New features and enhancements in Microsoft Teams
Microsoft Teams enables effective collaboration and communication for more than 320 million users around the world. New features in Teams bring a smarter, more personalized and simpler experience.
New Teams meeting features include:
- Voice isolation in Teams meetings and calls: This AI-driven feature is an advanced noise suppression capability that leverages a user’s voice profile and suppresses other people’s voices in the background during a Teams meeting or call. AI in Teams will recognize an individual’s voice and filter only their voice in Teams meetings and calls. This feature rollout has begun and will be generally available in early 2024.
- Decorate your background: Meeting participants will be able to use generative background effects in Teams to show up at their best – even when the space they’re working from isn’t at its best. With Decorate your background, meeting participants can use the power of AI to generate a background that decorates and enhances their real-world room, by cleaning up clutter or adding plants to a wall. This feature will be available early next year in Teams Premium.
Teams chat and channels enhancements, which are rolling out now through January 2024, include:
- Customize default reactions: Users will have the flexibility to change and select their default emoji reactions in Teams chat, as well as reduce the number of reactions in their interface.
- Forward chat: Users will be able to easily share a message received with another colleague by simply clicking on the message and selecting forward from the menu.
- Group chat profile picture: Users will have a better way to reflect their team and group chat topic with custom pictures. Group chat members can upload an image or use pre-selected illustrations and emojis.
- Loop components in channels: When composing a post in a channel, users will be able to easily co-create and collaborate on Loop components such as tables, lists, progress trackers and more.
- Channel announcement background: Users will be able to create a personalized announcement background that harnesses creativity and engages teams in new ways. Users will be able to add images, type a description or use the power of AI to generate a personalized background. Creating an image using generative AI will be available in Teams Premium and Copilot.
New Teams Phone features include:
- Private line: Now in general availability, users can have a private second phone number for a select set of callers so they can make calls directly to a specified contact, bypassing delegates, admins or assistants. Inbound calls to the private line will be distinguished by a unique ringtone and notification.
- Protected voicemail: To ensure users don’t miss important voicemails with sensitive information, users will now receive notifications for protected voicemails in the Calls app in Teams with a link to access the voicemail securely in the Outlook web app. This is now generally available.
- New Teams Phone offers in India: For customers who have employees based in India, Microsoft is working with local operators – Airtel, Tata Communications Limited and Tata Tele Business Services – which will be launching their Teams Phone-powered solutions in compliance with regulatory requirements in the market. These solutions will provide employees with greater calling flexibility and will support work-from-home scenarios as well.
Updates to create a simpler and easier Teams experience include:
- Microsoft Teams web experience: Microsoft recently released the new Teams app for Windows and Mac, and now, the new Teams web experience is generally available for web customers who use Microsoft Edge or Google Chrome, providing a faster and simpler Teams experience to help users save time and collaborate more efficiently. New Teams is reimagined from the ground up to deliver up to two times faster performance while using 50 percent less memory. From redesigning channels and simplifying notifications to enhancing personalization options, the new Teams web experience offers a simpler and responsive user experience that’s easier to navigate and accomplish with fewer clicks.
- New Teams keyboard shortcuts: New keyboard shortcuts in Teams will save users time when composing a message, navigating in the app, taking an action in a chat and channels or changing a setting. New shortcuts include Alt+Shift+D to set status to Do not disturb and Alt+Shift+R to reply quickly to the latest message received. The full list of keyboard shortcuts is available, and new shortcuts will be generally available by January 2024.
- Code block enhancements: New enhancements to code blocks will make it easier for users to send code in Teams. Users will be able to start a code block using the entry point in format options or by using markdown. Users can pick/change the code language for syntax highlighting when pasting or writing code. This update will be generally available by January 2024.
- Simplified notifications: Users will be able to clear notifications with a single click in activity, chat and channels and customize Teams notification settings to quickly identify what matters most. This update will be generally available in January 2024.
- Manage your teams and channels: Users can easily manage the channels list in Teams to focus on what matters most. When starting a new collaboration space, users will be prompted to create a channel, and when joining a new team users can choose only the channels they would like to show in the channel list. When a channel is no longer active or relevant, a user can archive the channel, and it will be hidden and closed for further action, but the information can still be accessed. This update will be generally available in January 2024.
- Private team discovery: Admins will be able to make private teams discoverable in their organization. Users can view and search for these private teams through the “Join team gallery” in their client. This update will be generally available in January 2024.
- Shared channels enhancements: To make it easier to collaborate with external stakeholders, admins are able to set up a form that captures the user’s request to add an external member who is not from an approved organization. Team members can also create shared channels, if the channel owner permits it. Users can share direct links to a channel, post or reply. This update will be generally available in January 2024.
- Blog: Learn more about what’s new in Microsoft Teams.
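As an aside on the code block enhancements above: Teams follows common markdown conventions, where a fenced block is opened with three backticks plus a language name to get syntax highlighting. The snippet below is an invented example of the kind of code a user might paste into such a block (the function and names are illustrative, not part of Teams):

```python
# Illustrative snippet a user might share in a Teams code block;
# choosing "python" as the block's language enables Python highlighting.
def greet(name: str) -> str:
    """Return a short greeting for the given name."""
    return f"Hello, {name}!"

print(greet("Teams"))
```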
4.2.4. New features streamline IT management for Teams experiences
Several new features have been developed for Microsoft Teams to help streamline processes for IT admins and organizations. They include:
Teams Rooms low-friction deployment: Teams Rooms can now be deployed using Windows Autopilot , in private preview, reducing deployment times from days to just a few hours. For devices that can’t use Autopilot, Microsoft is introducing a simplified deployment option with one-time passwords (OTPs), 16-digit codes that will eliminate the risk of sharing access credentials. OTP will be generally available in November. Remote Access is coming to the Teams Rooms Pro Management portal and will allow remote troubleshooting and proactive maintenance, monitoring device health and preventing issues before they affect meetings.
Simplifying Teams Phone deployment with shared calling: Shared calling allows admins to enable groups of users to make and receive public switched telephone network (PSTN) calls using a shared phone number and calling plan. With shared calling, a single phone number and calling plan can be shared across a team of users whether that’s a team of 10 people in a small office or 10,000 users in an enterprise department – they just need to have a Teams Phone license through either Microsoft 365 E5 or Teams Phone Standard. This feature is generally available.
Advanced Collaboration Tools: Advanced Collaboration Tools in Teams Premium empower IT admins to provide a more secure and self-regulated environment. Priority account chat controls, in preview, empower users to manage unwanted internal communications via policy settings. Users are notified about chats from new contacts, giving them a choice to accept or block the conversations. Advanced collaboration analytics, now generally available, offers deep insights into external collaboration behaviors. It empowers Teams admins to facilitate successful collaboration and mitigate potential risks of external collaboration.
- Blog: Learn more about Advanced Collaboration Tools.
- Blog: Learn more about updates to Teams Phone.
- Pre-recorded: Microsoft Teams Phone: the smart calling solution for Teams
4.3. Microsoft Viva
4.3.1. Updates to Viva Engage and Viva Amplify
Viva Engage and Viva Amplify are tools to allow organizations to connect everyone through employee communities and conversations to build meaningful relationships and give company messages needed volume. Updates to both platforms include:
- Multi-tenant organization (MTO) communication: Viva Engage supports MTO cross-tenant communication to make it easier for leaders to communicate and engage at scale to build community across the organization. Leaders can send a storyline announcement across tenants to share the same story to all stakeholders. Responding, reacting and analytics will be supported for cross-tenant posts. This update is now generally available.
- Publish from Viva Amplify to Viva Engage: This integration will allow for publishing from Viva Amplify to Viva Engage communities and storylines. It will also incorporate content shared to Engage within reports in Viva Amplify. This update will be available in private preview early next year.
- Viva Goals and Viva Engage integration: The integration will bring mission, alignment and results into communities and conversation by helping customers build communities around goals, view goal progress from Viva Goals directly in Viva Engage and create divisional communities within Viva Engage designed to help leaders better communicate top-down interactions and initiatives, including delivering praise based on goal progress and achievement. This update will be generally available in the first half of 2024.
- Seeded and AI-connected knowledge in Answers: This update will allow users to use AI to generate questions and answers from existing files and import them into Answers. Also, AI will automatically route open questions to people who might have the answer. Users can identify the right topics for a question, see top similar responses before publishing, and aid in routing to experts. This update will start rolling out later this year.
- Viva Amplify updates: Viva Amplify will include publication templates for campaigns, adds localization in more than 50 languages and will include better reporting from system data to incorporate custom data from HR tools into filtering and reports. This update will be generally available in December.
- Discussion: AI in Viva Engage
5. Microsoft Copilot
5.1. Azure Data
5.1.1. Introducing Copilot in Microsoft Fabric
Microsoft Fabric is being infused with Azure OpenAI Service at every layer to help customers unlock the full potential of their data, enabling developers to leverage the power of generative AI against their data and assisting business users to find insights in their data.
With Copilot in Microsoft Fabric in every data experience, customers will be able to use conversational language to create dataflows and data pipelines, generate code and entire functions, build machine learning models or visualize results. Customers will even be able to create their own conversational language experiences that combine Azure OpenAI Service models and their data, and publish them as plugins.
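As a purely hypothetical illustration of the code-generation scenario above (the prompt, table and column names are invented here, not actual Fabric output), a conversational request such as "total revenue per product, top three" might yield pandas code along these lines:

```python
import pandas as pd

# Stand-in sales data; in Fabric this would come from a lakehouse table.
# The schema below is illustrative only.
sales = pd.DataFrame({
    "product": ["A", "B", "A", "C", "B", "A"],
    "revenue": [100.0, 250.0, 300.0, 50.0, 150.0, 200.0],
})

# Aggregate revenue per product and keep the three largest totals.
top3 = (
    sales.groupby("product", as_index=False)["revenue"]
    .sum()
    .sort_values("revenue", ascending=False)
    .head(3)
)
print(top3)
```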
Copilot in Fabric, now in preview, will build on Microsoft’s existing commitments to data security and privacy in the enterprise. Copilot inherits an organization’s security, compliance and privacy policies. Microsoft does not use an organization’s tenant data to train the base language models that power Copilot.
- Keynote: AI transformation for your organization with the Microsoft Cloud
- Breakout: Build powerful AI apps with Copilot in Microsoft Fabric
5.2. Azure Management & Operations
5.2.1. Microsoft Copilot for Azure boosts productivity with generative AI
Microsoft Copilot for Azure, now in preview, is an AI companion that will simplify how users design, operate, optimize and troubleshoot applications and infrastructure from cloud to edge.
With Copilot for Azure, customers will gain new insights into their workloads, unlock untapped Azure functionality and orchestrate tasks across both cloud and edge. Copilot will leverage large language models (LLMs), the Azure control plane and insights about a user’s Azure and Arc-enabled assets. All of this is carried out within the framework of Azure’s steadfast commitment to safeguarding the customer’s data security and privacy.
- Blog: Simplify IT management with Microsoft Copilot for Azure – save time and get answers fast
- Breakout: Simplifying cloud operations with the Azure Portal
- Discussion: Leveraging AI to manage your workloads with new Azure management tools
5.3.1. Bringing Copilot to everyone
The effort to simplify the user experience and make Copilot more accessible to everyone starts with Bing, Microsoft’s leading experience for the web. Bing Chat and Bing Chat Enterprise will now simply become Copilot. For Microsoft Entra customers, Copilot in Bing, Edge and Windows adds commercial data protection. Copilot will be out of preview and become generally available starting December 1. Over time, Microsoft will also expand eligibility of Copilot with commercial data protection to even more Entra ID (formerly Azure Active Directory) users at no additional cost.
5.4. Copilot for Sales and Copilot for Service
5.4.1. Introducing Microsoft Copilot for Service
The power of generative AI is being extended to business user workflows with the introduction of Microsoft Copilot for Sales and Microsoft Copilot for Service. These copilots are designed to help organizational functions reinvent business processes with AI and stay in the flow of work. Both Copilots will include a license for Microsoft Copilot for Microsoft 365, allowing for generative AI support across all workflows to surface business insights, accelerate decision-making and create materials, such as briefs or customer communications, all with context rooted in data from existing CRM systems of record, knowledge repositories, third-party applications and Microsoft 365.
Microsoft Copilot for Service is a new business copilot that will help extend existing contact centers with generative AI to boost agent productivity. Copilot for Service will include Copilot for Microsoft 365 and will integrate with third-party CRM and contact center solutions. Organizations will be able to simply point to their data, such as trusted websites, knowledgebase articles, files and, more importantly, data sources from their existing contact center, and in a few minutes unlock generative AI-powered conversations across all their data. Copilot for Service will include out-of-the-box integrations with Salesforce, ServiceNow and Zendesk, and will be extended to other systems with more than 1,000 pre-built and custom connectors.
During customer interactions, agents will be able to ask Copilot for Service questions in natural language and receive relevant insights pulled from customer and product data sourced from knowledge repositories. This will improve the speed at which agents can meaningfully assist and resolve cases, providing better experiences for both agents and customers. And since Copilot for Service will include Copilot for Microsoft 365, these productivity resources will be delivered in the tools agents already use every day, such as Outlook and Teams, as well as third-party agent desktops of choice like Salesforce, ServiceNow, Zendesk and others.
- Demo: Discover how Dynamics 365 Copilot empowers service teams
- Discussion: Learn how Dynamics 365 Copilot unlocks modern service experiences
5.4.2. Microsoft Copilot for Sales will boost enhancements and integration partnerships
Microsoft Sales Copilot, already generally available and being used by 15,000 organizations including Rockwell Automation, Netlogic and Avanade, helps sellers create more personalized customer engagements and increase productivity. Building off this success, Microsoft is today announcing the new Microsoft Copilot for Sales, which will include licenses for Microsoft Copilot for Microsoft 365. Copilot for Sales will help sellers by leveraging the power of generative AI across Microsoft 365 apps, CRM systems of record and other relevant third-party data sources via Power Platform connectors.
Copilot for Sales capabilities will include access to Copilot for Microsoft 365 that will bring the power of both Copilots to allow sellers to harness the benefits of generative AI across all workflows and productivity surfaces, such as Microsoft Teams and Word in a newly integrated experience.
- Meeting recaps in Teams will combine insights from Copilot for Sales and Copilot for Microsoft 365 to surface action items, task creation and conversation KPIs.
- Copilot for Sales will be able to create a meeting preparation brief by pulling relevant information into a Word document, including a summary of the account and opportunity, names and titles of meeting participants, open tasks related to the opportunity, relevant highlights from recent meetings, relevant email threads and more.
- Blog: Introducing new Copilot experiences to boost productivity and elevate customer experiences across the organization
- Breakout: Transform customer experience with Dynamics 365 and next-generation AI
- Discussion: Discuss the extensible architecture of copilots in sales and marketing
- Demo: See the latest enhancements with Copilot in sales and marketing
5.5. Dynamics 365
5.5.1. New capabilities and integrated offerings for Copilot in Dynamics 365 Field Service
Dynamics 365 Field Service, which helps businesses transform their service operations and improve customer experiences through AI, mixed reality and the Internet of Things (IoT), has several new Copilot capabilities and integrated offerings. They include:
Improved technician productivity with next-generation AI
In preview beginning December 2023, frontline workers will be able to access key work order information by asking Copilot questions within Microsoft Teams. Using natural language, they will be able to simply state what they need to receive specific information related to work orders in Dynamics 365 Field Service, including status updates, parts needed or instructions to help them get the job done.
Additionally, with the Dynamics 365 Field Service app in Teams, generally available in December, frontline workers will be able to view and edit their work orders in the flow of work in Teams. Copilot will also become generally available to assist frontline managers with work order scheduling in Microsoft Teams, saving time and effort to find the right worker for the job.
With the preview of new Copilot capabilities in Dynamics 365 Field Service mobile in December, frontline technicians will be able to quickly get a summary of key points in a work order without having to navigate through a series of tabs. They will be able to swiftly make progress updates by simply speaking to Copilot and describing what they did. Copilot will provide suggestions to efficiently check off service tasks, add notes and update product quantities as well as statuses, accelerating data-entry so technicians can focus on providing great customer service.
Streamlined manager workflows with next-generation AI
Introduced in preview earlier this year and becoming generally available in December, Copilot capabilities in the Dynamics 365 Field Service Outlook add-in for frontline managers will streamline work order creation with relevant details pre-populated from emails, as well as optimize technician scheduling with data-driven recommendations based on factors such as travel time, availability and skillset, without switching apps. Relevant work orders are surfaced within this experience for managers to review before creating new work orders, and these can be easily rescheduled or updated as customer needs change.
Introduced in preview earlier this year and now generally available, a redesigned Dynamics 365 Field Service work order management experience brings important information front and center – reducing the number of clicks for key tasks by more than a third. Additionally, Copilot is available in preview within this experience to provide frontline managers intelligent recaps so they can stay up to date without having to navigate through all the information in a work order.
Efficiencies with more integrated offerings
Seamless financial and inventory data flow between Dynamics 365 Field Service, Dynamics 365 Finance and Supply Chain Management will help ensure the frontline and back-office operations stay in sync. By syncing real-time price and cost information from work orders, and automatically updating financial and inventory data as work orders are executed, this integration, in preview, will reduce the effort required to connect these Dynamics 365 apps.
Beginning December 2023, Dynamics 365 Field Service customers can get access to Dynamics 365 Guides and Dynamics 365 Remote Assist at no additional cost. Users will be able to create guides to provide technicians with step-by-step instructions for key tasks and enable real-time collaboration with remote experts via mobile or HoloLens 2 devices when additional assistance is needed. Also, customers can purchase Dynamics 365 Field Service Contractor to provide essential work order management functionality to vendors as they scale field service operations to meet demand.
- Blog: Learn more about these updates .
5.5.2. New Copilot in Dynamics 365 Sales features
Two new Copilot in Microsoft Dynamics 365 Sales experiences will allow sellers to interact with their data using natural language and include:
- The ability for sellers to use natural language with Copilot in Dynamics 365 Sales to get contextual insights and recommendations for leads and opportunities, in addition to using pre-built prompts previously announced. This feature is in preview.
- A full-screen view for Copilot in Dynamics 365 Sales where sellers will be able to use natural language or pre-built prompts to gain a quick understanding of customers, deals, meetings, forecasts and more. This experience will roll out starting in November in Asia and Europe and will continue into other markets at later dates.
- While Copilot in Dynamics 365 Sales already integrates with popular seller tools like People.ai, Copilot now takes that a step further with extensibility for independent software vendors (ISVs) who can leverage Power Platform Connectors to integrate into the Opportunity Summary view in Outlook. These connectors are already being used by DocuSign and PROS to bring relevant contract details, and pricing and quotes, respectively, into Outlook.
5.5.3. New features, enhancements and partnerships for Copilot in Dynamics 365 Customer Insights
Dynamics 365 Customer Insights, which uses real-time data and next-generation AI to help marketers understand their customers and create personalized experiences, has several new features and updates, all in preview over the next month. They include:
Deeper customer understanding and connected marketing and sales capabilities. Marketers will now be able to qualify their leads using metrics, such as engagement scores, and easily hand off to their sellers to help ensure marketing and sales teams maximize the opportunity pipeline and increase win rates.
Sellers will benefit from AI-generated customer profile summaries from their customer data platform application that summarize key components of customer profiles, including demographic, transactional, behavioral and analytics data, to generate key insights for quick customer understanding.
Integration partnerships with Optimizely. Organizations will be able to leverage segments and journeys built in Dynamics 365 Customer Insights in Optimizely to create campaigns that can be personalized in real time without writing any code. With bi-directional integrations, organizations will be able to provide hyper-personalized omnichannel experiences to their customers across web, social, email and offline channels.
5.6.1. Simplify management of Edge for Business with Copilot in Microsoft 365 admin center
Copilot will be coming to the Edge management service via Copilot in Microsoft 365 admin center to provide an intelligent management experience for Edge for Business. Copilot in Microsoft 365 admin center will provide a shared experience across other Microsoft 365 admin centers as well, including SharePoint, Teams and more, to create a cohesive Copilot experience across Microsoft 365 management services for IT admins.
Currently available in private preview, Copilot in the Edge management service will give guidance to IT admins on recommended policies and extensions for their users. More capabilities will be added in the future.
- Product roundtable: Shaping the future of admin experiences in Copilot
5.7. Industry Cloud
5.7.1. Copilot in Dynamics 365 Guides brings mixed reality and AI to real-world operations
Copilot in Microsoft Dynamics 365 Guides is a new capability that will bring the power of mixed reality and generative AI to real-world operations, enabling frontline workers to address issues faster with the combined power of AI, mixed reality and the Microsoft Cloud.
Copilot will enable field service technicians and operators in manufacturing and service industries to converse with AI using natural language. The AI, powered by Azure OpenAI Service, will pull from a curated set of documentation to serve up relevant information based on the task at hand. That information could include things like help with troubleshooting, step-by-step procedural guidance, detailed information, service work history, current device readings or completion of work order forms. Holographic overlays will allow workers to accomplish tasks seamlessly without breaking the flow of work.
With Copilot in Dynamics 365 Guides, organizations will be able to:
- Improve operational efficiency by enabling workers with the right information at the right time within the flow of work. With the ability to point and ask, frontline workers will be able to point at a component on a machine that needs service or repairs, ask questions and receive intelligent answers that help them resolve issues faster and reduce on-the-job frustration.
- Accelerate the onboarding of new workers and upskill existing workforce effectively with an immersive and intelligent learning platform in mixed reality, which improves confidence in executing a task while reducing cognitive overload. The capability will help create personalized, role-based training to accelerate onboarding time and reduce the need for retraining.
- Seamlessly capture, integrate and democratize institutional knowledge into existing workflows and access detailed repair guidance, historical service data and best practices in real time, helping to ensure consistent, high-quality maintenance while minimizing errors.
- Enable frontline workers to access and update forms and systems of record associated with their work and the equipment they’re working on, without leaving the flow of work.
Copilot in Dynamics 365 Guides is currently in private preview on HoloLens 2.
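The question-answering flow described above, pulling the most relevant snippet from a curated documentation set, can be sketched with a deliberately naive keyword-overlap retriever. The documents and scoring below are invented for illustration; the actual feature is powered by Azure OpenAI Service, not this toy matching:

```python
# Curated documentation set; entries are illustrative, not real content.
DOCS = {
    "pump_troubleshooting": "If the pump pressure drops, check the intake valve seal.",
    "conveyor_maintenance": "Lubricate conveyor bearings every 500 operating hours.",
    "safety_lockout": "Apply lockout-tagout before servicing any powered equipment.",
}

def retrieve(question: str) -> str:
    """Return the document whose text shares the most words with the question."""
    q_words = set(question.lower().split())

    def overlap(text: str) -> int:
        # Count shared lowercase words between question and document.
        return len(q_words & set(text.lower().split()))

    best = max(DOCS, key=lambda name: overlap(DOCS[name]))
    return DOCS[best]

print(retrieve("why did the pump pressure drop"))
```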
5.8. Microsoft Copilot for Microsoft 365
5.8.1. New capabilities in Microsoft Copilot for Microsoft 365
Microsoft Copilot for Microsoft 365 is an AI assistant at work. It automatically inherits an organization’s security, compliance and privacy policies for Microsoft 365. Data is managed in line with Microsoft’s current commitments. It’s integrated into the Microsoft 365 apps millions of people use every day – Word, Excel, PowerPoint, Outlook, Teams and more. Copilot for Microsoft 365 is already used by tens of thousands of enterprise users in the Early Access Program, including customers at companies like Visa and KPMG, and is now generally available for enterprise customers.
New capabilities for Copilots include:
Microsoft Copilot Dashboard
As organizations invest in AI productivity tools, leaders need clear visibility into how employees are leveraging AI and how it is transforming their organization’s productivity. They need to ensure these tools are being used to their full potential. Microsoft Copilot Dashboard, powered by Microsoft Viva, will help leaders answer how Copilot is affecting their organization and changing the way people work. This dashboard will enable leaders and organizations to plan their Copilot readiness, drive adoption and measure the impact of Copilot for Microsoft 365. It’s available as part of Microsoft 365 subscriptions starting today and coming to the Viva Insights app in Teams and on the web in December 2023.
Copilot in Microsoft Teams
With the combined power of Copilot in Microsoft Whiteboard and Copilot in Teams, meeting participants will be able to visualize meeting discussions. In addition to live summarization and notes during a Teams meeting, coming next year, Copilot will be able to visualize spoken discussion points and organize them in Whiteboard. Copilot will be able to turn meeting participants’ spoken ideas and topics into a visual collaboration space in Whiteboard that can be shared across all meeting participants. Copilot will be able to suggest more ideas to add to Whiteboard as the meeting conversation happens. The captured content on Whiteboard will be saved as a Microsoft Whiteboard file, accessible via Teams, OneDrive for Business and directly via the Whiteboard app. For Copilot licensed users, a Copilot-generated summary of the Whiteboard can be shared as a Loop component in Outlook, Word, Teams and more.
Copilot in Teams meetings will be able to take notes throughout a meeting and share with all participants during a Teams meeting. Users will be able to add shared meeting notes and agenda items in Collaborative notes. Collaborative notes are Loop components which stay in sync across all the places they’ve been shared. When enabled before the meeting, Copilot will automatically take live notes within Collaborative notes, so attendees can focus on the discussion. These notes will be shared with meeting participants and Copilot users will be able to ask for more specific notes like “Capture what Beth said as a quote.” Only one meeting participant will be required to have the Copilot license for the notes to be viewable and editable to all meeting participants. This capability will roll out next year in the Teams desktop and web app.
Copilot in Teams can be used during Teams meetings without retaining transcription. In the Teams admin center, admins can now assign a policy to enable Copilot with or without transcription, and meeting organizers can set their preference before the meeting starts. When enabled without transcription, Copilot can be used during the meeting to ask any question about the meeting, and after the meeting ends no transcript or Copilot interactions will be retained. Since no transcript is retained after the meeting, neither intelligent recap nor Copilot will be available after the meeting when this policy is enabled. This capability is generally available.
Copilot in Microsoft Teams channels, now generally available, enables users to quickly synthesize key information from a past conversation and have it summarized, with organized key discussion points. The summary of information includes citations, to keep users aware of the source.
The new Copilot compose box in Teams chat and channels, now generally available, serves as a writing assistant to help edit messages. Copilot can help rewrite messages, adapt the tone and modify length of the message before the user sends it.
In July, Copilot in Teams Phone was announced, allowing users to get real-time summarization and insights during calls. Users will soon be able to access Copilot from the Calls app in Teams to generate summaries, action items or ask any specific questions about the conversation after a call has wrapped up. The Copilot post-call experience in the Calls app will be supported for both VoIP and PSTN calls.
Copilot in Microsoft Outlook
Copilot in Microsoft Outlook will make it possible to do more in less time and with less effort. The new Copilot features will help users prepare for and schedule meetings.
Here is how it will work:
- Preparing for meetings : Copilot will be able to generate a detailed summary of upcoming meetings based on the invite, related emails and attached files, making it easier to get ready for the meeting in minutes.
- Scheduling meetings : Copilot will make it easy to schedule a follow-up meeting or any other meeting from an email summary. Either click the suggested action or chat with Copilot to set up a meeting. Copilot will be able to create an agenda, a title, a list of attendees and a context for the meeting. It will also be able to find the best available times for everyone.
Copilot in Outlook will begin to roll out in early 2024.
Copilot in Microsoft Loop
Copilot in Loop will be able to use information from a document linked in the prompt to generate even more relevant responses. This capability is currently rolling out for customers with a Copilot license and will be available on the web. Copilot will also be able to intelligently adapt previous Loop pages to new projects, creating custom templates. This capability will roll out next year for customers with a Copilot license, and the updates are in preview.
Copilot in Microsoft Word
Catchup and comments in Copilot in Word will help users understand what has changed in a document by asking questions like, “How do I see what has changed in this document?” to reveal changes and revisions made by anyone who has accessed the document. Copilot will also use information from Microsoft Graph for more personalization, taking into account user preferences, interests and other information. These updates will be available in early 2024.
- Blog: Introducing Microsoft Copilot Studio and new features in Copilot for Microsoft 365
- Blog: Microsoft Loop: built for the new way of work, generally available to Microsoft 365 work accounts
- Keynote: Unlock Productivity with Microsoft Copilot
- Breakout: Transform copilot development
- Demo: Build conversational experiences with generative AI
5.8.2. Extend and enrich Microsoft Copilot for Microsoft 365 with plugins and connectors
With Microsoft Copilot for Microsoft 365, users will be able to expand skills and knowledge of Copilot via plugins and Microsoft Graph connectors, now in preview. Users will be able to install and enable plugins to extend the game-changing abilities of Copilot for Microsoft 365 with internal and third-party applications they use every day.
Microsoft 365 admins will be able to deploy Microsoft Graph connectors and have their data semantically surfaced by Copilot in response to user prompts. Users will benefit from fast, semantic retrieval of relevant data that is managed within the data governance boundaries of their Microsoft 365 tenant.
Copilot-generated responses will provide the right citations and attributions that will help users trace and access the source content. Several plugins and Graph connectors, including Web search powered by Bing, Microsoft Dataverse, Jira, Trello, Mural, Confluence, Freshworks and Priority Matrix and more are now available. Also, customers like KPMG, Air India and Avanade, among others, are developing custom plugins for their business needs, providing them with the benefits of Copilot for Microsoft 365 with their line-of-business applications for their users.
Plugins can be installed by users via the app store in Teams, Outlook and Microsoft365.com, and Graph connectors can be installed by IT from Microsoft 365 admin center.
New controls in Microsoft 365 admin center, now generally available, enable admins to discover and manage plugins for Microsoft Copilot, a set of tools that helps people achieve more using AI. Microsoft 365 admins can also now quickly and easily identify available plugins for Microsoft Copilot, enable them for users in the tenant and set appropriate access policies for user activation. Independent software vendor (ISV) and internally developed apps with plugin capabilities are automatically highlighted for admin review. Plugin access controls are centrally managed and can be set by user, group and tenant wide.
Developers will be able to build plugins for Microsoft Copilot via Teams message extensions to extend Copilot for Microsoft 365. Developers will be able to build or enhance Teams Message Extensions with Teams Toolkit for Visual Studio and Visual Studio Code and get plugins to extend Copilot for Microsoft 365. Message extensions must use manifest version 1.13 or above. Additionally, developers will be able to build Teams message extensions directly from APIs. API-based message extensions will extend Copilot in the future. This capability is in preview.
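To make the manifest requirement above concrete, here is a hypothetical, heavily abridged Teams app manifest for a search-based message extension. The app name, GUIDs and command are placeholder examples, and a real manifest also requires fields such as `developer`, `icons` and `accentColor` that are omitted here for brevity; only the `manifestVersion` value of 1.13 or above is the requirement stated in the announcement.

```json
{
  "$schema": "https://developer.microsoft.com/en-us/json-schemas/teams/v1.13/MicrosoftTeams.schema.json",
  "manifestVersion": "1.13",
  "id": "00000000-0000-0000-0000-000000000000",
  "version": "1.0.0",
  "name": { "short": "Ticket Lookup" },
  "description": {
    "short": "Look up support tickets",
    "full": "Search internal support tickets from Teams, and expose the same capability to Copilot."
  },
  "composeExtensions": [
    {
      "botId": "00000000-0000-0000-0000-000000000000",
      "commands": [
        {
          "id": "searchTickets",
          "type": "query",
          "title": "Search tickets",
          "description": "Find a support ticket by keyword",
          "parameters": [
            {
              "name": "keyword",
              "title": "Keyword",
              "description": "Search term to match against ticket titles"
            }
          ]
        }
      ]
    }
  ]
}
```

Because the message extension is described declaratively in the manifest, the same search command a user invokes from the Teams compose box is what Copilot can call as a plugin.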
Copilot for Microsoft 365 developer sandbox is now available in private preview. The sandbox SKU will enable developers to build and test plugins and Graph connectors in a non-production tenant environment, and it is available for purchase by independent software vendors (ISVs) in the Microsoft 365 TAP program and by Copilot for Microsoft 365 customers.
Microsoft 365 and Copilot program to publish apps and plugins for Microsoft 365 is available in preview in Partner Center. Independent software vendors (ISVs) will be able to submit apps that include Copilot plugins and Graph connectors for Teams, Outlook and more to the new Microsoft 365 and Copilot program. Developers will be able to package plugins and Graph connectors into the same unified app manifest and publish their app to Partner Center. Upon app validation, these apps will be made available in the Microsoft 365 admin center, where administrators can discover them and enable them for their users. Users can discover the apps and plugins through the app store in Teams, Outlook and Microsoft365.com.
- Blog: Learn more about Microsoft Copilot for Microsoft 365 plugins and connectors
- Blog: Learn more on the Microsoft 365 Developer blog.
- Breakout: Extend Microsoft 365 Copilot with your line of business apps and data
- Discussion: Ask early adopters: Building LOB plug-ins for Microsoft 365 Copilot
5.8.3. Introducing Copilot for Microsoft 365 admin in private preview
Microsoft Copilot for Microsoft 365 admin is a set of experiences that will harness the value of generative AI to boost Microsoft 365 admins’ productivity by streamlining daily work, so admins are empowered to focus on their strategic priorities, make faster decisions and maximize the value of their investments.
Copilot for Microsoft 365 admin is designed to help IT admins simplify their admin tasks, generate insights faster and get more out of Microsoft 365. Copilot can help admins navigate the vast array of tools, controls and configurations to quickly guide an admin to exactly where they need to go, offer up suggestions and guidance and simplify reporting and querying by surfacing information through natural language prompts – all in a seamless experience across Microsoft 365 admin center, specialized admin centers and more to meet admins where they are in the flow of work.
Copilot for Microsoft 365 admin for enterprise customers in the Microsoft 365 admin center, including Edge and Windows Update management service, Teams admin center, Exchange admin center and SharePoint admin center, is in private preview.
- Blog: Ignite 2023 – What’s New for Copilot and Microsoft 365 admins
- Demo: Empowering Microsoft 365 admins with new AI-powered experiences
- Pre-recorded: What’s new for Microsoft 365 admins
5.8.4. Introducing Microsoft Copilot Studio
Microsoft Copilot Studio is an end-to-end conversational AI platform that empowers users to create and customize copilots using natural language or a graphical interface. Copilot Studio is the perfect companion for Microsoft Copilot for Microsoft 365, putting IT in control, while ensuring predictability of responses for key topics.
With Copilot Studio, users will be able to easily design, test and publish copilots that suit specific needs and preferences. Users will be able to leverage the power of generative AI to dynamically create multi-turn answers over data and dialogues that are engaging and relevant for users, and to add specific conversations for predictable scenarios that require authored responses and workflows, such as expense management, HR onboarding or IT services.
Copilot Studio will also enable users to link copilots with the broader Microsoft Conversational AI stack through integrations with Azure AI Studio and additional Azure services. This will allow users to access advanced features, like speech recognition, sentiment analysis, entity extraction and more, while having full visibility and control with built-in governance.
Copilot Studio will provide an intuitive and integrated design studio experience for makers. With plugin builder, makers will be able to create, manage and test plugins, including Power Platform connectors and Power Automate flows. With prompt builder, makers will be able to create custom prompts, including ones that leverage generative AI.
Copilot Studio will enable users to create impactful conversational AI experiences and drive business outcomes. Users can now leverage Copilot Studio to customize Copilot for Microsoft 365, included in the Copilot for Microsoft 365 license.
- Blog: Learn more about Microsoft Copilot Studio.
- Blog: Learn more about updates to Microsoft 365.
5.9. Microsoft Viva
5.9.1. Viva and Microsoft 365 Chat integration, and Copilot in Viva updates
Microsoft Copilot for Microsoft 365 enhanced with Microsoft Viva will provide AI-powered assistance for employee experience. Copilot will work across Viva data and applications as a single interface to guide employees, managers and HR leaders with self-service insights and experiences such as understanding team health, setting new priorities with OKRs (Objectives and Key Results) or upskilling for career growth.
This new integration will be available for customers who have deployed both Copilot for Microsoft 365 and the Viva suite and will be in private preview early next year.
Additionally, updates to Copilot experiences in Viva apps will help organizations better understand and engage with their workforce to improve performance and include:
- Copilot in Microsoft Viva Insights will enable leaders and their delegates to use natural language prompts to generate personalized, dynamic reports that answer questions about their teams and organization, and will simplify the query building process for analysts. This Copilot will be in preview early next year.
- Copilot in Microsoft Viva Goals will enable users to easily generate and refine goals with conversational AI and from existing strategy documents, as well as summarize goal progress to share with and across teams. This Copilot will be in preview in December 2023.
- Copilot in Microsoft Viva Engage will help inspire leaders and employees to post using AI-created conversation starters, prompts and images. It will also give leaders insight into employee sentiment, cultivate an environment of trust by tailoring and refining message tone and enhance the quality of questions being asked with suggested prompts. This Copilot will be in preview in January 2024.
- Copilot in Microsoft Viva Learning will allow users to easily create structured learning collections, find the right learning resources and summarize learning content using conversational AI. This Copilot will be in private preview for joint Viva and SAP SuccessFactors customers by the end of 2023.
- Copilot in Microsoft Viva Glint will enable leaders to summarize and analyze thousands of employee comments from employee engagement surveys and provide a fresh way to explore feedback by asking questions through natural language. This Copilot will be available for private preview in January 2024.
Additionally, see here for details about a new dashboard from Viva Insights that provides leaders and organizations visibility into Microsoft Copilot adoption and impact.
- See also: 5.8.3. Introducing Copilot for Microsoft 365 admin in private preview
- Breakout: Drive engagement and performance with Microsoft Viva
5.10. Power Platform
5.10.1. New experiences for Copilot in Power Automate
Microsoft is announcing new Copilot experiences in Power Automate for developers and orchestrators. With these new releases, Copilot in Microsoft Power Automate will span process mining, API automation, robotic process automation (RPA) and orchestration.
These new experiences, in preview, will enable users to discover and create UI and API automation faster, while streamlining productivity and providing insights into how automations are run and managed. They include:
- Using Copilot to assist with desktop flows: Copilot will be integrated into the Console and the Designer of Power Automate for desktop. Users will be able to get help with desktop flows (RPA) by typing questions and getting relevant information and step-by-step instructions from documentation. Additionally, users will be able to generate scripts by describing what they want to do, and the code will be generated automatically.
- Using Copilot to analyze automation activity: Admins, Center of Excellence (CoE) teams and business users and makers with access to flow run histories will be able to query past runs in natural language across their environment.
- Breakout: Improve Efficiency with Power Automate Process Mining and RPA
- Demo: Drive process optimization with Copilot in process mining
- Demo: New features in Power Automate desktop designer
- Discussion: Enabling your Automation CoE to Modernize with Power Automate Q&A
- Discussion: How Process Mining can drive business efficiency in various industries Q&A
5.10.2. Power Apps continues to advance IT governance, Copilot and modern experiences in app development
Power Apps gives developers tools to rapidly build modern, high-performing AI-powered apps for enterprise. AI implementation through Copilot, prebuilt templates and drag-and-drop simplicity allow everyone to do more with less. Extensibility through integrations enables professional developers to build without limits. As part of Microsoft Power Platform, Power Apps benefits from advanced security and governance in place, making it easy to deploy and govern at massive scale.
New features and updates for Power Platform include:
Advanced governance features in Managed Environments for Power Platform
- Groups and rules will empower IT admins to control their Power Platform environments at enterprise scale. Environment groups will allow admins to categorize their environments into groups, and rule sets will allow admins to define a dedicated configuration for each of their environment groups. With groups and rules, admins will get more control with less effort. This feature is in limited preview, with preview planned for the end of the first quarter of 2024.
- Advisor in Managed Environments for Power Platform will provide IT with proactive recommendations and inline actions to govern and secure the platform more easily. This will include recommendations to clean up unused apps, lock down over-shared apps and add owners to abandoned apps. This feature is in preview.
Features to further accelerate building of Copilot-enabled apps
- Copilot for app users as a sidecar is now ready for every user of web player canvas apps backed by Dataverse. Copilot will automatically work with the data in the app and provide users with insights and answers, without any setup needed from app makers. This is the easiest way to give the power of generative AI to users. This feature is coming to limited preview in mid-December and to preview by the end of the first quarter of 2024.
- Makers can now also greatly extend and customize Copilot for app users embedded in apps through Copilot control. Microsoft Copilot Studio interoperability enables extensive customization that brings capabilities of Copilot Studio into every Copilot-embedded app. This feature will be in preview in the first half of 2024.
- For existing apps that are in managed environments and are missing their description, AI-generated app descriptions will be added automatically. For any new apps, when the makers are publishing, a draft of the app description will be presented for them. App descriptions will help end users find useful apps and IT admins understand their app landscape. This feature is in preview.
Making the creation of high-performing modern apps even easier
- Mobile native UI/UX: Mobile Power Apps will have a native UI/UX with smoother animations, faster performance and modern mobile interaction patterns. The improved performance and modern functionality will bring efficiency gains to everyone running apps on mobile devices, especially those executing high-volume actions. This update is in preview.
- With modern controls, now generally available, and theming in canvas apps, makers will be able to easily create elegant, accessible, fast and reliable apps. This will cut down the need to write complicated files for common scenarios and will accelerate development. Theming is in preview.
- See also: 5.8.4. Introducing Microsoft Copilot Studio
- Breakout: Operating Power Platform at Enterprise scale
- Discussion: Generative AI for low code development Q&A
5.10.3. Power Virtual Agents is now part of Microsoft Copilot Studio
Introducing Microsoft Copilot Studio, built on the foundations of Power Virtual Agents and the broader Microsoft conversational AI ecosystem. Copilot Studio provides new ways to build copilots and extend Microsoft Copilot with the latest generative AI capabilities. With Power Virtual Agents capabilities joining Copilot Studio, the Power Virtual Agents name will no longer be used. Existing PVA customers will access the same capabilities, all within Microsoft Copilot Studio.
- Breakout: Revolutionize the way you develop conversational AI
- Demo: How to build your own copilot with Microsoft Copilot Studio
5.11. Security Copilot
5.11.1. Microsoft Purview capabilities in Microsoft Security Copilot and embedding the Security Copilot experience in Microsoft Purview solutions
Microsoft Security Copilot will be embedded into Microsoft Purview. Data security admins receive an average of more than 50 alerts per day and can only get to 60 to 70 percent of them. Likewise, compliance admins spend 60 percent of their time reviewing evidence collected in review sets. Additionally, there is a steep learning curve for entry-level analysts to reach the expert level needed to assess and remediate risks.
With Security Copilot embedded in Purview, customers will be able to quickly generate a comprehensive summary of alerts and information to accelerate investigation and response, upskill talent via guided responses to navigate through information efficiently and use natural language to define search queries in eDiscovery to enable faster and more accurate search iterations by eliminating the need to use keyword query language. Embedded scenarios will be surfaced in Data Loss Prevention, Insider Risk Management and eDiscovery and Communication Compliance.
- Breakout: Secure and govern your data with Microsoft Purview
- Breakout: Beyond traditional DLP: Comprehensive and AI-powered data security
- Demo: Microsoft Security Copilot: Going Beyond Security Operations
- Pre-recorded: Accelerate risk assessment and incident investigation with AI
5.11.2. Microsoft Security Copilot embedded experience in private preview
The Microsoft Security Copilot embedded experience, available in private preview, will allow IT admins and security analysts to use Microsoft Security Copilot within the Microsoft Intune admin center. Integrating insights and data from security and management tools, Security Copilot provides customized guidance through generative AI to address an organization’s specific requirements, like intelligent policy creation and deployment and faster, easier troubleshooting.
These new endpoint management and security capabilities will be available in Intune, joining the existing security incidence investigation and device security posture improvement experiences currently available in the Early Access Program for Security Copilot capabilities.
- Breakout: Fortified security and simplicity come together with Microsoft Intune
- Breakout: Modern management innovation shaping endpoint security
- Breakout: Scaling AI across your business with Windows and Windows 365
5.11.3. New auditing capabilities within Microsoft Purview Audit for Copilot interactions
With this new feature, in preview, Microsoft Purview Audit consumers will be able to measure and track when users request assistance from Microsoft Copilot for Microsoft 365 and see the list of assets that were affected when responding to the request. These signals will enable security investigators to determine when content, such as files with sensitive data, was touched during a Copilot for Microsoft 365 interaction.
5.11.4. Security Copilot coming to Microsoft Entra to assist in investigating risks, troubleshooting
Microsoft Security Copilot will also be embedded in Microsoft Entra to assist in investigating identity risks and helping with troubleshooting daily identity tasks, such as why a sign-in required multi-factor authentication. IT admins can ask about users, groups, sign-ins and permissions and instantly get a risk summary, steps to remediate and recommended guidance for each identity at risk, in natural language. Additionally, in Microsoft Entra ID Governance, admins can use Security Copilot to guide in the creation of a lifecycle workflow to streamline the process of creating and issuing user credentials and access rights.
- Breakout: Secure access in the AI era: What’s new in Microsoft Entra
- Breakout: Accelerate your Zero Trust journey with unified access controls
5.11.5. Security Copilot will deliver unified solutions across services, data
Security has always been a notoriously siloed and fragmented function – both from a technological and organizational point of view. Microsoft Security Copilot will help transcend technological and talent boundaries by delivering a unified, efficient and intuitive experience for all professionals that secure organizations, including identity management, device management, data security and compliance professionals.
With the addition of these new scenarios, all in private preview, Microsoft’s customers who rely on our industry-leading Microsoft Entra, Purview, Intune and Sentinel solutions will be able to integrate Security Copilot into their routine tasks and workflows and use Security Copilot to assist in many ways, including:
Identity management (Microsoft Entra): Entra skills are now available in Security Copilot to enable security analysts to discover high-risk users, overprivileged access and suspicious sign-ins that aid in a security incident investigation and assess potential risk.
Device management (Microsoft Intune): New features enable IT admins to generate device policies and simulate their outcomes, gather device information for forensics and configure devices with best practices from similar deployments.
Data protection and compliance (Microsoft Purview): New skills for data protection, compliance and risk management identify data impacted by incidents, generate a summary of data and user risks, analyze documents and surface risks of collusion, fraud and sabotage.
Cloud security posture management (Microsoft Defender EASM and Defender for Cloud): New skills for posture management simplify external attack surface risk assessment and enable security admins to manage cloud security posture more efficiently. Security admins can quickly discover potential attack paths using natural language queries, get mitigation guidance for proactive prevention of threats and receive automatic notification for resource owners.
Embedded experience in the unified SIEM+XDR UX (Microsoft Defender XDR and Microsoft Sentinel): For the first time, the generative AI capabilities of Security Copilot are available in Microsoft’s unified experience across the award-winning Microsoft Defender XDR and Sentinel solutions, accelerating incident response with guided investigation, rapid aggregation of evidence across numerous data sources and advanced capabilities such as malware analysis.
- Keynote: The Future of Security with AI
- Breakout: Microsoft Sentinel: A modern approach to security operations
- Breakout: Security for AI: Discover, protect, and prepare in the AI era
- Breakout: Unifying XDR + SIEM: A new era in SecOps
6. Power Platform
6.1. Power Platform
6.1.1. Payment processing in Power Pages in preview
Payment processing integration, now in preview in Microsoft Power Pages, will enable makers to easily embed payment processing directly into their websites.
Payment processing platforms power online payment processing and commerce solutions. This unlocks a new use case for Power Pages – building websites that support payment processing – as makers will be able to use Power Pages to develop websites that accept payment from customers. For example, state and local governments will be able to process licensing fees, application fees and permitting fee payments through their Power Pages websites.
- Breakout: Build secure web apps and connect data faster with Power Pages
7. Security
7.1. Defender
7.1.1. Microsoft updates help evolve the security operations center experience
Several key updates across Microsoft’s suite of security solutions are designed to help Security Operations Center (SOC) professionals operate more efficiently and better protect their assets and data. These updates include:
Announcing Microsoft Defender XDR: Microsoft Defender 365 is now Microsoft Defender XDR. The new name best represents Microsoft’s extended detection and response (XDR) capabilities that span beyond products included in the Microsoft 365 suite. The native security solutions protect devices across Windows, Linux, macOS, Android and iOS, as well as multicloud environments spanning Azure, Amazon Web Services (AWS) and Google Cloud Platform (GCP). This update is generally available.
Microsoft Defender XDR and Microsoft Sentinel combine as a unified security operations platform: The unification of Defender XDR and Sentinel into a single, powerful user experience along with the addition of Microsoft Security Copilot generative AI will change how customers manage their security operations and protect their assets and data. Customers will have a high level of efficiency and ease of use with a single experience for their security operations tools. This will mean less clicking, less context switching and less training for more robust insights. With the integration of cutting-edge AI and automation technologies, defenders will be able to level up their skills with guided response across first- and third-party data sets. This update is in private preview.
Embedded Microsoft Security Copilot: Customers using the unified SOC platform will be able to access the benefits of an embedded generative AI tool that will help analysts to level up their security information and event management (SIEM) and XDR skills. Security Copilot will help by using natural language to write keyword query language (KQL) queries, understand malicious scripts, create incident summaries and provide support throughout the investigation and remediation process. This update is in early access.
Optimize data in the SIEM with SOC optimizations: This new feature will support SOCs in ensuring they are maximizing the value of the data they are ingesting into Sentinel with recommendations that will help them to save money, improve coverage and better secure themselves against specific threats. This feature is in private preview.
Improved response to user assets and cloud workloads: Integration of cloud workload alerts, signals and asset information from Microsoft Defender Cloud into the industry-leading XDR platform will help security teams combat cross-domain attacks more effectively. This powerful integration will provide SOC analysts with a holistic view, spanning workspace and cloud infrastructure, plus rich contextual insights to uncover the entire attack story in a single incident. This will protect organizations against advanced attacks with efficiency and speed. This update is in preview.
Auto-deployed decoys: Decoys will provide early-stage, high-fidelity signals that force adversaries to be correct 100 percent of the time, using built-in deception techniques that harness Microsoft Defender for Endpoint’s unique visibility into organizations and OpenAI’s GPT-4 generative AI model. Users will be able to automatically generate and disperse decoys and lures at scale that resemble real users and assets in the organization. This will allow SOC teams to detect and focus on attacks even more effectively. This update is in preview.
Protection of AI apps within an organization: A set of new capabilities across Microsoft Defender and Microsoft Purview will help defenders securely prepare for the new wave of AI and empower their organizations, while keeping their data and other assets secure. Microsoft Defender for Cloud Apps will extend its discovery capabilities to support over 400 large language model apps. Additionally, Purview Data Loss Prevention will help organizations create policies that prevent their users from pasting sensitive data to specific websites. This update is in preview.
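As a sketch of the KQL assistance described above, the following is the kind of query Security Copilot might draft from a natural-language prompt such as “show me the users with the most failed sign-ins over the last 24 hours.” The `SigninLogs` table and its columns follow the standard Microsoft Sentinel schema, but the prompt and query are an invented illustration, not actual Copilot output.

```kusto
// Illustrative only: failed Entra ID sign-ins in the last 24 hours,
// grouped by user and application, highest-volume offenders first.
SigninLogs
| where TimeGenerated > ago(24h)
| where ResultType != "0"          // a non-zero ResultType indicates a sign-in failure
| summarize FailedAttempts = count() by UserPrincipalName, AppDisplayName
| order by FailedAttempts desc
| take 10
```

The value of the embedded experience is that an analyst can describe this intent in plain language and then review, refine or run the generated query, rather than recalling the table schema and KQL operators from memory.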
- Blog: Learn more about the unification of Microsoft Defender XDR and Microsoft Sentinel.
- Blog: Learn more about updates to Microsoft Defender XDR.
- Blog: Learn more about updates to Microsoft Sentinel.
- Blog: Learn more about updates to Microsoft Security.
- Breakout: Hello unification: A new era in SecOps
- Breakout: Security for AI: Prepare, protect, and defend in the AI era
- Demo: Getting started with Microsoft Sentinel
- Demo: Protect more with Microsoft Sentinel and 365 Defender together
7.1.2. New features for Microsoft Defender for Cloud
Microsoft Defender for Cloud helps organizations protect multicloud and hybrid environments with comprehensive security across the full lifecycle, from development to runtime. Several key updates will help security admins adopt a comprehensive cloud-native application protection strategy and improve security posture across multicloud environments and DevOps platforms. These updates include:
- Unify identity and access permissions insights to improve cloud security posture through integration with Microsoft Entra Permissions Management: Security admins will get a centralized view of the Permissions Creep Index, drive least-privilege access controls for cloud resources and get proactive attack path analysis that connects the dots between access permissions and other potential vulnerabilities across Azure, Amazon Web Services (AWS) and Google Cloud. This update is in preview.
- DevOps security insights across GitHub, Azure DevOps and GitLab: Security admins will get deep visibility into their application security posture across GitHub, Azure DevOps and GitLab within Defender for Cloud, now in preview. In addition to GitHub Advanced Security and GitHub Advanced Security for Azure DevOps, with the preview of the GitLab Ultimate integration, Defender for Cloud will now support the three major developer platforms.
- Improved container security across multicloud environments : Security admins will be able to get ahead of containerized application risks and prioritize misconfigurations and exposures in their Kubernetes deployments with the expansion of Defender Cloud Security Posture Management’s (CSPM) contextual graph-based capabilities to Amazon Elastic Kubernetes Service (Amazon EKS) and Google Kubernetes Engine (GKE) clusters. This update will be in preview soon.
- Enable proactive attack path analysis across clouds and faster risk mitigation: Security admins will be able to reduce recommendation fatigue and efficiently remediate critical risks with a risk-based recommendation engine and enhanced attack path analysis that identify and prioritize remediation of more complex risks, such as cross-cloud attack paths. New code-to-cloud mapping will also enable security admins to significantly reduce the time and effort needed to address critical security flaws right in the code itself. Additionally, the new ServiceNow integration will enable admins to use their existing system to automate or drive mitigation of risks. This update is in preview.
- Improved API security posture: With the general availability of Defender for APIs plan in Defender for Cloud, security admins can gain visibility of business-critical APIs, prioritize vulnerability fixes and quickly detect active real-time threats for APIs published in Azure API Management. New preview capabilities targeting sensitive data classification powered by Microsoft Purview and curated attack paths will help security admins further safeguard data from API risks.
- Microsoft Security Copilot in Defender for Cloud: Security admins will be able to gain efficiency in discovering and remediating risks with the power of AI-generated guidance. Security admins will be able to easily identify risks and vulnerabilities across their cloud environment using natural language questions. This feature is in private preview.
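Attack path analysis of the kind described in the list above is essentially path-finding over a resource graph. The following sketch uses a hypothetical graph model and resource names, not Defender CSPM's actual engine:

```python
# Illustrative sketch: find paths from internet-exposed resources to
# sensitive data stores in a directed resource graph.
from collections import deque

def find_attack_paths(edges, exposed, sensitive):
    """BFS from each exposed node; return paths reaching a sensitive node."""
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)
    paths = []
    for start in exposed:
        queue = deque([[start]])
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node in sensitive:
                paths.append(path)
                continue
            for nxt in graph.get(node, []):
                if nxt not in path:  # avoid revisiting nodes (cycles)
                    queue.append(path + [nxt])
    return paths

# Hypothetical multicloud resource graph: a public VM can reach storage
# account keys, which in turn grant access to a database holding PII.
edges = [("vm-public", "storage-keys"),
         ("storage-keys", "sql-pii"),
         ("vm-public", "logs")]
paths = find_attack_paths(edges, exposed={"vm-public"}, sensitive={"sql-pii"})
```

Here `paths` contains the single chain `vm-public → storage-keys → sql-pii`, which is the kind of cross-resource exposure a risk-based engine would surface for prioritized remediation.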
- Blog: Announcing new CNAPP capabilities in Defender for Cloud
- Breakout: Boost multicloud security with a comprehensive code to cloud strategy
- Demo: Protecting multicloud resources with code-to-cloud CNAPP
7.2.1. Microsoft Entra Permissions Management adding more integrations
The integration of Microsoft Entra Permissions Management with Microsoft Defender for Cloud (MDC), now in preview, will provide an efficient way to consolidate insights into other cloud security posture information on a single interface. Customers will receive actionable recommendations for addressing permissions risks in the MDC dashboard and gain a centralized view of the Permissions Creep Index, facilitating the enforcement of least privilege access for cloud resources across Azure, Amazon Web Services (AWS) and Google Cloud Platform (GCP).
Another integration is with ServiceNow, one of the most popular IT Service Management (ITSM) solutions. Through this integration, customers can request time-bound, on-demand permissions for multicloud environments, such as Azure, AWS and GCP, via the ServiceNow portal. This integration helps organizations enhance their Zero Trust posture by enforcing the principle of least privilege for multicloud permissions and streamlines access permission requests within existing approval workflows.
These updates are now generally available.
- Blog: Identity at Microsoft Ignite: Securing access in the era of AI.
- Product roundtable: Build the Future of your Multicloud Access with Microsoft Entra
7.2.2. Microsoft’s Security Service Edge expands Internet Access and Private Access preview
Microsoft’s Security Service Edge (SSE) solution secures access to any app or resource from anywhere and includes Microsoft Entra Internet Access (Internet Access) and Microsoft Entra Private Access (Private Access). Internet Access will expand its preview to include context-aware Secure Web Gateway (SWG) capabilities for all internet apps and resources. The extended preview of capabilities for Private Access will help make Private Access fully ready for traditional VPN replacement.
Internet Access capabilities, coming soon to preview, include:
- Universal Conditional Access for any internet endpoint from managed devices and networks. By managing all access policies in one place, organizations will be able to extend adaptive Conditional Access controls universally to any network destination, such as an external website or a non-federated SaaS application, without needing to change those applications.
- Token theft protection for Entra ID apps through compliant network check in Conditional Access. With this control, users will be able to protect Entra-integrated cloud applications against token theft and ensure users do not bypass network security policies specific to their tenant while accessing critical cloud services.
- Source IP restoration in Identity Protection and Conditional Access location policies. Internet Access will offer differentiated backward compatibility of trusted location checks in Conditional Access and continuous access evaluation, identity risk detection and logging, by ensuring the users’ original source IP is maintained.
- Context aware SWG will restrict user access to unsafe and non-compliant content with web content filtering (URL, FQDN, web category) and make internet filtering policies more succinct, readable and comprehensive by leveraging the rich user, device and location awareness of Conditional Access.
- Improved security, visibility and user experience for Microsoft 365 will include data exfiltration protection through universal tenant restriction and prevention of anonymous access for Microsoft Teams and SharePoint. This capability will provide great performance and resiliency for Microsoft 365 applications and flexible deployment options for Microsoft 365 scenarios with third-party SSE vendors.
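The context-aware SWG filtering described in the list above — restricting access by URL, FQDN or web category — can be sketched as follows. The category map, domains and function name are invented for illustration:

```python
# Hedged sketch of SWG-style web filtering: deny by FQDN or web category,
# otherwise allow. Data and names are hypothetical.
from urllib.parse import urlparse

CATEGORY_OF = {"gambling.example": "gambling", "news.example": "news"}
BLOCKED_CATEGORIES = {"gambling", "malware"}
BLOCKED_FQDNS = {"filesharing.example"}

def filter_request(url: str) -> str:
    """Return 'block' or 'allow' for an outbound web request."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_FQDNS:
        return "block"
    if CATEGORY_OF.get(host) in BLOCKED_CATEGORIES:
        return "block"
    return "allow"
```

In the product, the same decision would additionally take the rich user, device and location context of Conditional Access into account; the sketch only shows the destination-side checks.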
Private Access capabilities, in preview, include:
- VPN replacement: Extended protocol support for private app access will be provided: in addition to Transmission Control Protocol (TCP), User Datagram Protocol (UDP) support will be added, along with private Domain Name System (DNS) support. These enhancements will enable customers to seamlessly transition from their traditional Virtual Private Network (VPN) deployments to a fully ready, identity-centric Zero Trust Network Access (ZTNA) solution.
- Multifactor authentication (MFA) to all on-premises apps: Private Access will provide Conditional Access controls and modern authentication methods, such as MFA, to secure access to all private applications and resources. This will apply to any application, located anywhere, for both remote and on-premises users.
Both Internet Access and Private Access have cross-OS clients (Windows, Android, macOS, iOS) and increased global presence with more points of presence (POPs).
- Blog: Identity at Microsoft Ignite: Securing access in the era of AI
- Discussion: Microsoft’s Security Service Edge (SSE) solution Q&A
- Demo: Microsoft’s Security Service Edge (SSE) Solution in action
- Product roundtable: Share your feedback on Microsoft’s Security Service Edge solution
7.2.3. Updates for Microsoft Entra ID now generally available
Microsoft Entra ID is an identity and access management solution that connects employees, customers and partners to their apps, devices and data for hybrid and multicloud environments. The following updates are now generally available.
Microsoft managed Conditional Access policies : Microsoft will begin to automatically enroll customers into Conditional Access policies based on their risk signals, current usage and licensing. The policies will enhance the security posture and reduce the complexity of managing Conditional Access.
Microsoft Entra Certificate-Based Authentication (CBA): Microsoft Entra CBA now offers several new features to improve the security posture of customers. These enhancements enable customers to customize authentication policies based on certificates, resource type and user group. Customers now have more control and flexibility to choose certificate strength for different users, combine CBA with other methods for multifactor or step-up authentication and configure authentication strength either tenant wide or by user group.
In addition, in early 2024, Microsoft Entra ID users will be able to sign in with passkeys managed by the Microsoft Authenticator app . By using passkeys, customers will have an additional phishing-resistant credential based on open standards and will ensure access to the latest security enhancements that will be added to the FIDO standard in the coming years.
7.3.1. Enriched, high-fidelity security alerts to empower data security teams in preview
Several key updates, now in preview, will help data security teams manage critical insider risks and include:
Enrich DLP incident management with insider risk insights: Traditional Data Loss Prevention (DLP) solutions generate alerts when certain conditions are met, such as when a user copies confidential files to a USB device. However, these alerts typically only highlight the specific incident and files impacted, without providing context about where the files originated or what other actions the user took.
With a new feature powered by Insider Risk Management, DLP alerts will be enriched with user context, allowing DLP analysts and security operations center (SOC) analysts, with appropriate permissions, to see a summary of past user activities that may have led to potential data security incidents. This is all part of the incident management experience in Microsoft Purview and Microsoft Defender.
For example, with this feature, the abovementioned DLP alert will now contain a summary of the critical sequence of actions taken by the user, showing that they downloaded confidential files from SharePoint, downgraded the sensitivity label and compressed the files into a zip file before exfiltrating them to the USB device. With this context, analysts will be able to better understand the user’s intent and determine whether they were trying to exfiltrate sensitive data while evading detection. This feature will help analysts gain a better understanding of a DLP incident and make faster, more informed decisions on how to respond to potential incidents.
Support administrative units in Insider Risk Management: Different departments or geographic locations may have varying policies or preferences for managing insider risks. Regional regulations may also necessitate distinct designs or processes for implementing insider risk programs. To accommodate these needs, Insider Risk Management will support administrative units. This feature will allow admins with the appropriate permissions to subdivide the organization into smaller units and assign specific admins or role groups to manage only the members of those units. For instance, German insider risk admins can create and manage policies exclusively for German users, while German insider risk investigators can investigate alerts and activities solely from German users.
Get more high-fidelity alerts with recommended policy tuning: When instruments are finely tuned, music can be played smoothly. Similarly, data security solutions can be more effective and powerful when policy configurations are optimized. However, reaching an optimal stage can be time-consuming, as admins may need to experiment with different configurations, and each iteration can take several days to yield results. To address this challenge, Insider Risk Management will provide recommendations and sensitivity analysis to help admins set policy thresholds for certain user activities based on real-time analytics. This feature will save security teams time in fine-tuning policies and enable them to receive an optimal volume of high-fidelity alerts more quickly.
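The threshold-recommendation idea can be illustrated with a minimal sketch: given observed daily per-user event counts, pick the lowest threshold that keeps the expected alert volume at or below a target. This only shows the concept; Purview's real-time analytics are more sophisticated:

```python
# Illustrative sketch of policy-threshold tuning from observed activity.
def recommend_threshold(daily_event_counts, target_alerts):
    """Return the smallest threshold yielding <= target_alerts alerts."""
    for threshold in range(1, max(daily_event_counts) + 2):
        alerts = sum(1 for c in daily_event_counts if c >= threshold)
        if alerts <= target_alerts:
            return threshold
    return None

# Hypothetical data: most users trigger a few events; two are outliers.
counts = [1, 2, 2, 3, 50, 80]
suggested = recommend_threshold(counts, target_alerts=2)  # 4
```

With this data a threshold of 4 flags only the two outliers, so analysts see a small number of high-fidelity alerts instead of one alert per routine user activity.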
- Blog: Empower data security teams to proactively manage critical insider risks across diverse digital estates
7.3.2. Incident investigation, eDiscovery capabilities added to Microsoft Purview
With the Microsoft Purview eDiscovery incident investigation experience, users can accelerate incident investigations by harnessing incident parameters to know where to look for the exact evidence required to assess and mitigate incident risk.
Users will be able to explore incident insights to further understand blast radius and risk levels, mitigate incidents with comprehensive actions to expand the investigation, invite collaboration with investigation stakeholders and manage the evidence to resolve the investigation. This feature is in preview.
With the integration of Microsoft Security Copilot into eDiscovery, users will be able to use natural language queries to find and collect potentially relevant content for investigations. This will ease the burden of writing complex queries to search and find content, help identify the relevant source locations and simplify the time-consuming validation of search criteria and results. The power of Copilot will be used to summarize threads, files and content sets within an investigation. Users will be able to act based on the extracted insights to further accelerate the pace of investigation and review. This feature is in preview.
- Blog: The Next Era of eDiscovery: Embracing Advanced Capabilities for a Comprehensive Digital Landscape
7.3.3. New capabilities in Purview Data Loss Prevention
Microsoft Purview Data Loss Prevention (DLP) is a cloud-native solution that proactively prevents accidental or unauthorized loss of sensitive data across apps, services and devices.
Several new capabilities in Purview DLP will help organizations comprehensively protect sensitive data loss, including expanding protections across Microsoft and non-Microsoft platforms, providing additional protection capabilities, as well as additional features that can help DLP and Security Operations Center (SOC) admins be efficient in their day-to-day tasks.
These updates fall under several categories:
Expanding the breadth of protection
- DLP support for Windows on Arm will let customers extend Microsoft Endpoint DLP policies and actions to endpoints running Windows on Arm. This will allow them to detect and protect sensitive data in files within their digital ecosystem. This update is in preview.
- Enhancement of DLP capabilities on macOS endpoints , including the ability to create groups of printers, USBs, network shares and sensitive service domains and apply different restrictions to each group, with the ability to apply the most restrictive action against multiple DLP rules. This update is generally available.
Expanding the depth of protection
- Just-in-time protection for endpoint DLP that restricts activity at the time of egress is generally available.
- Performance improvements for enforcing restrictions on sensitive content shared over Microsoft Teams Chat are generally available.
Empowering admins to be efficient
- Enriching DLP alerts with user insights from Purview Insider Risk Management will bring in the user context within the DLP alert for efficient investigations. This update is in preview.
- The ability to store original files, resulting in a DLP policy match as evidence for investigations. This update is in preview.
- Richer filter for DLP alerts in Microsoft 365 Defender, including file name, file path and latency, helping admins get more out of the alerts. This update is in preview.
- Simulation mode for DLP policies to enable admins to try a Microsoft Purview DLP policy, assess its impact and fine-tune the policy as required in an isolated environment. This will help admins build confidence in the configuration of the policy and reduce the policy enforcement time. This update is in preview.
- DLP recommendations for highlighting current risks in the organization’s environment, quick policy setup to mitigate the risk and policy fine-tuning to make existing policies better and reduce noise. This update is generally available.
- Support for admin units for DLP alerts in Microsoft 365 Defender is generally available.
- Blog: Gain comprehensive data protection and efficient investigation with Microsoft Purview DLP
7.3.4. New features for Microsoft Communication Compliance
Microsoft is introducing cutting-edge AI and Microsoft Security Copilot capabilities to bolster Microsoft Communication Compliance. These advanced features will leverage AI and Security Copilot technology to improve the detection and management of compliance issues in communication. This marks a significant step forward in ensuring organizations can harness the power of AI to maintain secure, productive and accountable communication while staying compliant with regulations. These updates include:
- Communication Compliance document and alert summaries with Security Copilot: This embedded Copilot experience will simplify communication policy matches, offering contextual summaries, conversation evaluation, risk network identification and intelligent triage. This combination will enhance data security and compliance measures for organizations. This update is in private preview.
- Microsoft Teams compliant meetings: Users can leverage voice-to-text machine learning to convert Teams meeting recordings to text, analyze transcripts for potential risky content and display video snippets of policy matches to facilitate triage. This update is generally available.
- Microsoft Viva Engage message reporting: Microsoft Viva Engage users are empowered to report inappropriate or concerning posts and comments within Viva Engage conversations. Reported messages then get triaged within Communication Compliance. This update is generally available.
- Next-generation business conduct detection: Using Azure AI Content Safety, new detections are being added to Communication Compliance that will enable teams to build safer online environments by detecting potential violence, hate, sexual and self-harm content, then assigning severity scores to unsafe text across languages. These classifiers are built using large language models. This update is in preview.
- Blog: Unleash the Future of Communication Compliance at Microsoft Ignite 2023
7.3.5. New features in Purview Information Protection now generally available
Microsoft Purview Information Protection helps organizations understand which data is sensitive and business critical, and then how to manage and protect it. New features that are now generally available include:
New contextual predicates in service-side auto-labeling: This is a novel way of auto-classification based on document properties, such as the type of file (file extension), size, who the document was created by and if the document name contains certain words or phrases. This enables organizations to intelligently discover and label groups of documents which, according to these contextual predicates, may contain sensitive data.
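Conceptually, a contextual predicate is a conjunction of property checks over document metadata. A minimal sketch with a hypothetical predicate schema (the real auto-labeling configuration lives in Purview, not in code like this):

```python
# Sketch of contextual-predicate matching for auto-labeling, based on the
# properties named above: extension, size, author and name keywords.
def matches_predicates(doc, predicates):
    """doc: dict of document properties; predicates: dict of conditions."""
    if "extensions" in predicates and doc["extension"] not in predicates["extensions"]:
        return False
    if "min_size_kb" in predicates and doc["size_kb"] < predicates["min_size_kb"]:
        return False
    if "created_by" in predicates and doc["created_by"] not in predicates["created_by"]:
        return False
    if "name_contains" in predicates and not any(
            kw.lower() in doc["name"].lower() for kw in predicates["name_contains"]):
        return False
    return True

# Hypothetical rule: label Office documents whose names suggest payroll data.
rule = {"extensions": {"docx", "xlsx"}, "name_contains": ["payroll", "salary"]}
```

A document named `Payroll_Q3.docx` would match this rule and receive the configured label; an otherwise identical `Agenda.docx` would not.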
Auto labeling (for files at rest in SharePoint Online) can now label PDF files: Auto labeling for files at rest in SharePoint Online will automatically start labeling PDF files, a widely used file type, in addition to currently supported Word, Excel and PowerPoint files.
Application of a default sensitivity label for a SharePoint document library: With this update, all newly uploaded documents to a document library can inherit the configured label for the document library (if not already labeled). All documents, either newly created or modified, in that library will be automatically assigned with that library’s label. Site default labels allow users to protect all documents because the library itself is sensitive – without needing to define classification policies.
Secure collaboration on labeled and encrypted documents with user-defined permissions: With user-defined permissions, document owners no longer need admins to create special labels for their highly confidential documents. Instead, they can specify the permissions themselves by applying user-defined permissions (UDP) labels on files. UDP-labeled files in SharePoint support co-authoring. This capability is very popular with C-Suite users and users working on tented projects who need to limit access to highly confidential documents to a small set of explicitly authorized individuals that they select.
Sensitivity labels to protect Microsoft Teams shared channels: This release is about safeguarding the confidentiality of Teams channels by preventing users from discovering private teams that they were not previously able to view or join. Shared channel access controls enable users to apply settings, such as internal only, same team only or private team only, on shared channels to better secure confidential information.
Microsoft Fabric support for sensitivity labels: With Fabric support, sensitivity labels follow the data automatically as it flows from the lakehouse to Power BI reports, Microsoft 365 files and other assets business users rely on every day, for end-to-end, comprehensive protection.
Configure policy tips as popups for labeled emails and attachments: Admins can now configure data loss protection (DLP) rules that display warnings in pop-up dialogs before users send emails. This makes it more difficult for users to inadvertently overshare or send emails to external users who aren’t authorized according to their organization’s policies. Admins can set up rules to provide warnings only, block actions entirely, require business justification or request explicit acknowledgements before sending emails.
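The warn/block/justify/acknowledge behavior described above can be sketched as a small decision function. The rule model here is hypothetical, not the actual DLP rule schema:

```python
# Sketch of the pop-up policy-tip decision: a rule can warn, require a
# business justification or acknowledgement, or block the send entirely.
def policy_tip_action(rule: dict, recipients_external: bool) -> str:
    """Map a (hypothetical) DLP rule to the action shown before sending."""
    if not recipients_external:
        return "send"  # rule only applies to external recipients here
    return {
        "warn": "show_warning_then_send",
        "justify": "require_business_justification",
        "acknowledge": "require_acknowledgement",
        "block": "block_send",
    }[rule["action"]]
```

The point of the popup (versus an inline tip) is that the user must interact with the decision before the email leaves the organization, which is what makes accidental oversharing harder.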
Double-key encryption to protect sensitive files and emails in Microsoft 365 apps on Windows: To protect the most sensitive content, users of Microsoft 365 apps can now use Double Key Encryption (DKE) for files and emails with built-in labeling in Office. With DKE, Microsoft stores one key in Microsoft Azure and the user holds the other key, so that only the user can ever decrypt protected content. Sensitivity labels configured with DKE in Purview Compliance portal are now available for users in Word, Excel, PowerPoint and Outlook to publish or consume content protected with DKE.
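The defining property of DKE — that neither key holder alone can decrypt — can be illustrated with a toy key-splitting scheme. This XOR construction is purely conceptual; real DKE uses proper asymmetric cryptography and Azure-hosted key services:

```python
# Conceptual sketch: the content key is split into two shares so that
# decryption requires BOTH the cloud-held share and the customer-held share.
import secrets

def protect(plaintext: bytes):
    """Encrypt with a one-time content key, then split that key in two."""
    content_key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, content_key))
    customer_share = secrets.token_bytes(len(content_key))
    cloud_share = bytes(k ^ s for k, s in zip(content_key, customer_share))
    return ciphertext, cloud_share, customer_share

def unprotect(ciphertext, cloud_share, customer_share):
    """Recombine both shares to recover the content key, then decrypt."""
    content_key = bytes(a ^ b for a, b in zip(cloud_share, customer_share))
    return bytes(c ^ k for c, k in zip(ciphertext, content_key))
```

Because each share is independently random, possessing only the cloud share (or only the customer share) reveals nothing about the content key — which mirrors the DKE guarantee that Microsoft alone can never decrypt the protected content.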
Tracking and revocation in Compliance portal: The tracking and revocation feature enables users to check who has tried accessing their sensitivity labeled and encrypted Office files and revoke access when needed.
In addition, users will be able to extend sensitivity labels to assets in Azure, now in gated preview. With Purview, users will be able to extend the reach of Information Protection sensitivity labels and the value from built-in sensitive information types to a much broader set of data locations and data types. Users will be able to work with existing sensitivity labels or create new ones via the Purview compliance portal to extend security and compliance intent to data assets in Azure.
- Blog: Insightful and intelligent classification and protection are key to data security
- Pre-recorded: How Microsoft Purview helps you protect your data
- Product roundtable: Understand your risk and protect your most sensitive data
7.3.6. Secure data in the AI era with Microsoft Purview
Microsoft Purview can help secure data in AI and help organizations adopt AI, including Microsoft Copilots and non-Microsoft AI applications. These capabilities can provide organizations:
- Visibility into the risks associated with sensitive data usage and user activity context in AI applications in their environment.
- Comprehensive protection with ready-to-use policies to prevent sensitive data loss in AI.
- Compliance controls to help easily meet business and regulatory requirements and detect code of conduct and business violations.
Purview capabilities will be integrated with Microsoft Copilots, starting with Microsoft Copilot for Microsoft 365 integrations, which are now generally available. With Copilot for Microsoft 365 customers will be able to:
- Discover data security risks, including sensitive data shared with Copilot and risky use of Copilot.
- See Copilot honoring Purview Information Protection sensitivity label access restrictions, inheriting sensitivity labels from referenced files and citing sensitivity labels in responses.
- Capture Copilot prompts and responses as evidence in Purview Audit.
- Run content search in Purview eDiscovery.
- Manage retention and deletion policies for Copilot prompts and responses in Purview Data Lifecycle Management.
- Detect business and code of conduct violation in Copilot prompts and responses in Purview Communication Compliance.
- Blog: Securing data in an AI-first world with Microsoft Purview
7.3.7. Unified Microsoft Purview portal expands sphere of protection
Organizations must manage multiple solutions to help discover, protect and detect risks surrounding their most sensitive data. To help change the way organizations secure their data, Microsoft Purview is providing one unified Microsoft Purview portal, now in private preview, with these additional features:
Data visibility across all environments: Purview is expanding horizons and enabling visibility of sensitive data beyond Microsoft and into other clouds and applications, making it easier to track and store in a safe place.
Lifecycle protection: Organizations will be able to apply labels on sensitive data – just like they do for applications like Microsoft Outlook or Teams – but will be able to do so in databases, like SQL, or other clouds like Amazon S3. This ensures the labels and sensitive information types transfer across environments, providing a unified layer of protection.
Multicloud detection in Insider Risk Management: Data doesn’t move itself, it’s people who move and interact with data, and that’s where the majority of data security risks stem from. As users within organizations use multiple applications and cloud services in their day-to-day work, security teams must comprehend the risks associated with these user activities that may lead to potential data security incidents. Insider Risk Management will add detections in clouds like Azure and Amazon Web Services (AWS), as well as in applications like Box, Dropbox, Google Drive and GitHub. Admins will be able to incorporate these multicloud detections in their data leak and data theft playbooks, making the insights more comprehensive.
- Pre-recorded: Protect your entire data estate across multiple clouds
8. Windows
8.1. Windows Commercial & Enterprise
8.1.1. New features coming to Windows 365 and Azure Virtual Desktop
A host of new features for Windows 365 and Azure Virtual Desktop include:
- Windows App, in preview, will be the place to connect to any devices or applications across Windows 365, Azure Virtual Desktop, Remote Desktop, Remote Desktop Services, Microsoft Dev Box and more.
- Windows 365 GPU support, in preview, will make it ideal for workloads, such as graphic design, image and video rendering, 3D modeling, data processing and visualization applications.
- Windows 365 AI capabilities will help customers reduce costs, increase efficiency and further simplify security and management of Windows 365 Cloud PCs. One example will be applying AI to assess Cloud PC deployment and utilization to provide recommendations to help organizations better forecast and right-size their Cloud PC investment. This capability will be in preview in the coming months.
- Azure Virtual Desktop Autoscale for personal desktops is the Azure Virtual Desktop native scaling solution that automatically starts session host virtual machines (VMs) according to schedule or using Start VM on Connect. It then deallocates or hibernates session host VMs based on the user session state (log off/disconnect). The deallocating capability is now generally available, and hibernating session host VMs is in preview.
- Single sign-on (SSO) and passwordless authentication support, along with third-party identity provider (IDP) support, is now generally available for both Windows 365 and Azure Virtual Desktop. Microsoft is also actively working on enabling the same capabilities for Azure Virtual Desktop approved providers.
- Windows 365 Customer Lockbox, in preview, will ensure that Microsoft support engineers can’t access content to do service operations without explicit approval.
- Windows 365 Custom Managed Keys will allow organizations to encrypt their Windows 365 Cloud PC disks utilizing their own encryption keys. This feature will be in preview soon.
- Watermarking, screen capture protection and tamper protection support for both Windows 365 and Azure Virtual Desktop are now generally available, protecting against unauthorized access and manipulation of data, keeping sensitive information protected and maintaining organizational data integrity.
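The personal-desktop Autoscale behavior described in the list above can be sketched as a simple decision function. Field names and the idle threshold are illustrative, not the service's actual configuration schema:

```python
# Sketch of Autoscale for personal desktops: active sessions keep their
# host running; disconnected or logged-off hosts are hibernated (when
# supported) or deallocated after an idle window.
def autoscale_action(session_state: str, idle_minutes: int,
                     hibernate_supported: bool, idle_threshold: int = 30) -> str:
    if session_state == "active":
        return "keep_running"
    if idle_minutes < idle_threshold:
        return "keep_running"  # grace period before releasing the VM
    return "hibernate" if hibernate_supported else "deallocate"
```

Hibernation preserves the in-memory session for a fast resume, while deallocation releases compute cost entirely; the sketch shows why both actions exist side by side in the service.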
- Blog: Learn more about these updates
- Breakout: Scaling AI to every aspect of your business with Windows
- Demo: Windows 365 and Azure Virtual Desktop: when, where, why, and how
- Discussion: Windows 365 and Azure Virtual Desktop Q&A
- OnDemand: The Windows Cloud experience
8.1.2. Universal Print moves function to the cloud
The last workload to move to the cloud is print. Universal Print delivers a complete service integrated with all of Microsoft 365 and Windows 365 to simplify print for employees and IT professionals.
With added support for macOS endpoints and an easy-to-use pull print functionality, both in preview, employees will be able to securely and conveniently print on any corporate printer from anywhere and from any device. macOS devices are now fully supported through Universal Print: users can print from any device, walk up to any corporate printer and securely release their print job, without having to choose the printer in advance.
8.1.3. Windows Autopatch simplifies and automates update management for Windows
Microsoft is extending Windows Autopatch to PCs for frontline workers by adding Autopatch to the Microsoft 365 F3+ subscription. This long-awaited extension of the update management service is now generally available.
In addition, Windows Autopatch will become the unifying Windows update management solution for enterprise customers. The single solution for enterprise update management can be used as a fully automated managed service to deliver the highest cost savings, allowing customers to reallocate resources to higher-value areas. Used from within Microsoft Intune, Windows Autopatch becomes a do-it-yourself, partially automated service that delivers a high level of control. Windows Autopatch is the solution to update and upgrade Windows devices, Microsoft 365 apps, Microsoft Teams and Microsoft Edge. Over time, Windows Update for Business deployment service will merge into the single service for enterprise customers.
Finally, Microsoft is adding firmware and driver update management granular controls to Windows Autopatch. This feature is now in private preview.
- Discussion: Windows 11, Windows 365, & Microsoft Intune Q&A
8.2. Windows Developer
8.2.1. Introducing Windows AI Studio and new features for Dev Home and WSL
Windows AI Studio for developers will simplify generative AI application development, bringing together cutting-edge AI tools and a model catalog. This will enable developers to fine-tune, customize and deploy small language models (SLMs) for local use in their Windows applications – all in one place.
Windows AI Studio, available in the coming weeks, will give developers greater choice to either run their models on the cloud on Azure or on the edge locally on Windows to meet their needs. And soon, Windows AI Studio will include prompt flow capabilities for prompt orchestration.
Dev Home with Azure DevOps (ADO) integration is now available in preview. Dev Home in Windows is a new feature that will help customers onboard new team members and projects faster through WinGet Configuration, in addition to managing projects and daily tasks with a customizable dashboard. An ADO extension, now available for Dev Home, helps enterprise developers easily clone their Azure repositories in Dev Home, manage ADO projects and stay on top of queries and relevant tasks from Dev Home.
Windows Subsystem for Linux (WSL) has three new features targeted at enterprise-use cases that are generally available and include:
- A new plugin for WSL, released by Microsoft Defender for Endpoint, which enables security teams to continuously monitor for events in all running distributions – delivering clear visibility into systems once considered a critical blind spot.
- New integrations with Microsoft Intune, which let admins control access to WSL and its key security settings.
- Networking improvements that add security by enforcing Windows firewall rules on WSL distributions, along with improved compatibility with VPNs and proxies in a corporate environment.
- Blog: Learn more about Windows Subsystem for Linux.
- Demo: Dev Home: Your new companion for productivity
Microsoft Fabric Updates Blog
Microsoft Fabric November 2023 update
Welcome to the November 2023 update.
We have lots of features this month including Narrative visual with Copilot, cross workspace “save as” for Data Factory, the general availability of Semantic model scale-out, and many more. Continue reading for more details on our new features!
- Microsoft Fabric User API
- On-object Interaction Updates
- Azure Maps visual now aggregates multiple data points at the same location
- Datasets renamed to semantic models
- Edit your data model in the Power BI Service – Updates
- Azure Resource Graph (New Connector)
- Profisee (Connector Update)
- Bloomberg Enterprise Data and Analytics (Connector Update)
- Dremio (Connector Update)
- Celonis (Connector Update)
- Advanced Filtering for Paginated Reports
- Editor’s pick of the quarter
- New visuals in AppSource
- Zebra BI Tables 6.6: Introducing Text Columns
- Funnel Chart by Powerviz
- Create interactive timelines with full control
- Enhanced accessibility in paginated reports authored in Report Builder
- Skill up on Fabric with the Microsoft Learn Cloud Skills Challenge
- Dynamic dataset binding for paginated reports
- Query Insights
- Fabric Warehouse publishing full DML to Delta Lake logs
- Automatic data compaction for Fabric Warehouse
- Fabric Warehouse support for sp_rename
- Improvements to CSV data ingestion
- Fabric enables you to read multi-TB results from Warehouse
- Blazing fast compute resource assignment is on
- SSD metadata caching
- Fabric SQL support for TRIM and GENERATE_SERIES
- Time-travelling through data: the magic of table clones
- REST API support for Warehouse
- SqlPackage support for Fabric Warehouse
- User experience improvements
- Dynamic data masking for Fabric Warehouse & SQL analytics endpoint
- Accessibility support for Lakehouse
- Enhanced multitasking experience in Lakehouse
- Upgraded datagrid capabilities in Lakehouse
- SQL re-provisioning support in Lakehouse
- Runtime 1.2 (Apache Spark 3.4, Java 11, Delta Lake 2.4)
- Multiple runtimes support
- Delta as the default table format in the new Runtime 1.2
- Intelligent Cache
- Monitoring hub for Spark enhancements
- Monitoring for Lakehouse operations
- Spark application resource usage analysis
- REST API support for Spark Job Definition (preview)
- REST API support for Lakehouse artifact, Load to tables and table maintenance
- Lakehouse support for git integration and deployment pipelines (preview)
- Embed a Power BI report in Notebook
- MSSparkUtils new API – reference run multiple notebooks in parallel
- Notebook resources .jar file support
- Notebook Git integration (preview)
- Notebook in Deployment Pipeline (preview)
- Notebook REST APIs (preview)
- Environment (preview)
- Synapse VS Code extension in vscode.dev (preview)
- Copilot in notebooks (preview)
- Custom Python operations in Data Wrangler
- Data Wrangler for Spark DataFrames (preview)
- MLflow Notebook Widget
- New model & experiment item usability improvements
- Recent experiment runs
- Models renamed to ML models
- Release of SynapseML v1.0
- Train interpretable Explainable Boosting Machines with SynapseML
- Prebuilt AI models
- Reusing existing Spark session in sparklyr
- REST API support for ML experiments and ML models
- New data science Happy Path tutorial
- Expansion of data science samples
- New data science forecasting sample
- Delta Parquet support in KQL DB
- Open-source connectors for KQL DB
- New and improved Get Data experience for Real-Time Analytics
- REST API support for KQL Database
- Splunk add-on (preview)
- Event Streams is now generally available
- Event Streams data transformation for KQL Database (generally available)
- Get data from Event Streams anywhere in Fabric
- Create a cloud connection within Event Streams
- Two ingestion modes for Lakehouse destination
- Optimize tables before ingesting data to Lakehouse
- General availability of Fabric connectors
- Automatic refresh cancellation
- Error message propagation through gateway
- Support for column binding for SAP HANA connector
- Staging artifacts will be hidden
- New Power Query editor style
- Support for VNET gateways (preview)
- Cross workspace “Save as”
- Dynamic content flyout integration with Email and Teams activity
- Copy activity now supports fault tolerance for Fabric Data Warehouse connector
- MongoDB and MongoDB Atlas connectors are now available
- Microsoft 365 connector now supports ingesting data into Lakehouse (preview)
- Multi-task support for editing pipelines in the designer from different workspaces
- String interpolation added to pipeline return value
We’re happy to announce the Public Preview of Microsoft Fabric User APIs. User APIs are a major enabler for both enterprises and partners to use Microsoft Fabric as they enable end-to-end fully automated interaction with the service, enable integration of Microsoft Fabric into external web applications, and generally enable customers and partners to scale their solutions more easily.
The set of APIs we’re introducing today includes the following, with many more coming soon:
More details can be found in Using the Microsoft Fabric REST APIs
A sample can be found here
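As a rough sketch of what this end-to-end automation can look like, the snippet below builds an authenticated call against the Fabric REST API. The base URL and the workspaces endpoint follow the preview documentation, but treat the exact paths and the shape of the JSON response as assumptions to verify against "Using the Microsoft Fabric REST APIs".

```python
# Hedged sketch: listing Fabric workspaces via the REST API.
# The base URL, endpoint path, and response shape are assumptions
# to verify against the official Fabric REST API documentation.
import json
import urllib.request

FABRIC_API_BASE = "https://api.fabric.microsoft.com/v1"  # assumed base URL


def build_list_workspaces_request(token: str) -> urllib.request.Request:
    """Builds an authenticated GET request for listing workspaces."""
    return urllib.request.Request(
        url=f"{FABRIC_API_BASE}/workspaces",
        headers={"Authorization": f"Bearer {token}"},  # Azure AD bearer token
        method="GET",
    )


def list_workspaces(token: str) -> list:
    """Sends the request and returns the assumed 'value' array of workspaces."""
    req = build_list_workspaces_request(token)
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return payload.get("value", [])
```

The token would come from your Azure AD app registration or user sign-in; acquiring it (for example with the MSAL library) is outside the scope of this sketch.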
In this update, we are beyond excited to finally show you Power BI’s latest innovation – the Button slicer, the ultimate tool for slicing data with ease and style! This is just the first step in a thrilling 5-stage journey that will revolutionize your data exploration experience, replacing the old tile slicer with a sleek, flexible alternative that lets you customize your insights and make data-driven decisions faster than ever! Brace yourself for the November upgrade, as it redefines customization and user-friendliness.
With so many upgrades and enhancements in the new Button slicer, we continue to go beyond the improvements you saw in the new card visual, and the following list of features will have you on the edge of your seat, so let’s have a look!
- Customize the shape and appearance of your design with more control to modify the corner radius.
- With an adjustable grid layout, you can now divide your design into rows and columns, and even use pixels to customize card spacing.
- When grid layouts have more buttons than rows and columns, use overflow styles, and choose Pagination or Continuous scrolling with vertical or horizontal direction – a big step forward in design flexibility!
- Just like the new card visual, the button slicer revolutionizes alignment and format properties, and the Label feature will spotlight crucial information from within your buttons.
- Image control allows you to infuse buttons with images, creating endless possibilities with URL images.
- Interactive states make your designs more engaging with options including on hover, on press, and selected to engage users with responsive interactivity!
- Get ready for the new formatting settings that will open a world of versatility and a new era of data visualization!
- Single select has been updated with a Force Selection toggle, and a new Select All option!
- The new multi-select feature is a significant step forward in user convenience and efficiency!
- A new Tooltips experience awaits you as the new button slicer now supports both default and report page tooltips to enrich your data visualization skills.
With this preview update, the new Button slicer feature is toggled ON by default for your convenience. You can find it in the Visual gallery on the ribbon, by selecting Build a visual after right-clicking on the canvas and then choosing the new slicer, or by selecting the new slicer from the on-object dialog. You can also toggle this preview feature ON or OFF under Options > Preview features > New button slicer.
Remember, this was only the first stage on the new slicer roadmap. So, fasten your seatbelt, and get ready for the exciting journey ahead as our next stage unfolds and reveals even more Power BI updates, enhancements, and features! Coming next, the List and dropdown slicer!
To learn more about the new Button slicer, read our blog post here.
Just like the Button slicer update mentioned above in this month’s Feature Summary, we’re equally excited to share Power BI’s second exhilarating advancement – Reference labels, a versatile tool for adding custom labels to new cards, providing relevant information, comparisons, key metrics, benchmarks, goals, and more, in an appealing and concise manner! As the second step in our 5-stage new card visual journey, this November upgrade will keep you smiling from ear to ear!
With Reference labels offering an abundance of creative possibilities, you and your users will be astounded by the world of wonder they unlock. Here’s an overview of the available features!
- Not only can you add Reference labels to your card visual, you can even add multiple data fields to your Reference labels.
- With three main components, Title, Value, and Detail, you can also choose Custom content using a different data field or measure and apply various styles to both Title and Value.
- Extra context is available for the Detail component, which shows data from a single data field well and is also customizable with styles and colors.
- With an active Divider, Reference labels have their own area where you can modify the divider line and use color to differentiate callout and reference labels.
- Reference labels can have a horizontal or vertical layout and can have custom spacing and padding.
With this preview update, the new Reference labels feature is toggled ON by default for your convenience. You can find it in the Visual gallery on the ribbon, by selecting Build a visual after right-clicking on the canvas and then choosing the new card visual, or by selecting the new card from the on-object dialog. You can also toggle this preview feature ON or OFF under Options > Preview features > Reference labels.
With the first and second stage of the new Card visual now delivered, you can imagine what exciting features are waiting for you as we continue this journey together.
To learn more about Reference labels, read our blog post here.
Enhance your Q&A visual with suggested synonyms from Copilot
The Q&A visual allows you to ask questions about your data and get answers in the form of a visual. It provides any report viewer with an intuitive means to explore their data without requiring deeper knowledge of the model or report authoring.
Today, the Q&A visual doesn’t rely on generative AI to function. The Q&A engine processes your natural language input all inside Power BI algorithmically using a variety of linguistic principles, associating words, and phrases you use with data in your model. This makes it good at answering precise questions about your data, but it may not be able to associate everything you input with data in the model. To help authors ensure that the Q&A visual provides consistent and accurate answers based on the unique language their report consumers use, we introduced Q&A setup tools with an emphasis on providing Q&A with synonyms for column and table names in the model. This way, authors can explicitly define different ways people might refer to their data, and users will always receive the correct answers when they ask similar questions in the future.
Power BI recognizes two types of synonyms: approved synonyms and suggestions. Approved synonyms either come directly from the names of fields themselves or are explicitly added by the author. When you use an approved synonym in your Q&A input, it will be treated just as though you used the name of the field, and the association will be presented with high confidence, signified by a solid blue underline.
Suggested terms are words Power BI thinks are likely to refer to their corresponding name. They come from a variety of sources – synonyms from the Office thesaurus show up by default, but you can also connect to your organization’s collection of approved terms and add those to your suggestions as well. Suggestions will still be used by Q&A, but with lower priority than approved synonyms, and the lower confidence will be signaled in the results with a dotted orange underline. In the Q&A setup menu, suggestions can be added to the approved synonyms list or removed entirely.
Managing synonyms is therefore an important part of improving the quality of the Q&A experience. However, coming up with synonyms for every data entity in your model can be mentally laborious and physically time-consuming. Copilot for Power BI streamlines this process by generating some for you!
If you have Copilot enabled, there are a few ways for you to get suggestions from Copilot. But first, you’ll have to enable the feature in Power BI Desktop in File > Options > Preview features > Improve Q&A with Copilot.
Then, you might be prompted to add synonyms with Copilot via a banner that shows up the first time you make a Q&A visual or open the Q&A setup menu:
You’ll also be able to get Copilot suggested synonyms via the Q&A setup menu. You can turn on Copilot as a source in the suggestion settings menu in the synonyms tab, then hit apply to get synonyms. Or, if Copilot is already enabled as a source, you can click the refresh button next to the suggestion settings dropdown.
After you’ve gotten these suggestions, you might be prompted to review them. You’ll find the new synonyms in the Suggestions column on the synonyms page of the Q&A setup menu:
Copilot-suggested synonyms will function just like any other suggested synonyms. This means they may be used by Q&A as a fallback when trying to determine which data fields a natural language input may refer to. Carefully review them in the Suggestions column of the Q&A setup menu, remove the synonyms that are inaccurate, and approve the ones that best fit the data.
Keep in mind that as we scale out Copilot, you might run into throttling, which may cause Copilot to return incomplete results if you send too many requests in a short period of time. If that happens, you can wait a bit and try again. Copilot may also not return results for terms for which it cannot generate synonyms, or when its results are deemed inappropriate by our content filter.
As we mentioned in our release of linguistic relationships for Q&A , we see our investment in both Copilot and the Q&A visual as mutually beneficial. There will be more features coming soon, so keep an eye out on the new ways in which we’re bringing the two together!
“Always open in new pane” setting:
Most requested, this month we are bringing you the ability to configure your pane switcher to stack panes instead of swapping them. If you preferred the earlier behavior, where panes opened side by side by default, you can now configure this setting by checking the new “always open in new pane” option from either the Options menu or the View ribbon.
To achieve the stacked behavior of panes as before:
Turn on the new option within the Options menu:
Or select the new option from the View ribbon:
Resizing the data flyout:
Also highly requested, this month we’ve also added the ability to resize the data flyout (second flyout) from the build button when working with long field names.
Note we have a known bug : there may be cases where the resize handles appear to the left of the data flyout if there is not enough space to expand on the right side. We’re working on a fix! As a workaround in these cases, you can move the visual to left temporarily on the canvas to resize the data flyout.
The table’s add button is back!
The Table visual has its add button again! We originally had to remove the add button from the Table visual because the only chart element it could add was the title, which has no default value. This added confusion to the experience: simply turning on the title did not appear to change anything in the visual, and users had to go to the format pane to type in their visual title. Last month we shipped placeholder text, which allowed us to bring back the add button for tables. Now, when you turn on the title, a placeholder appears so you can type directly onto the visual.
Azure Maps visual now aggregates multiple data points at the same location
Previously, when you had multiple data points with the same latitude and longitude field values, those points would be plotted separately, leaving them drawn one over the other at the same location. This could lead to some unclear data visualizations. For example, grouping the points by category using a legend field might leave just one category visible per location due to the overlap. This behavior was also different from visualizing those points using location names, which would aggregate the points together.
With this release, Azure Maps now aggregates points with the same latitude and longitude values in the same way that it does with location names, allowing you to see them as one bubble. These aggregated points can then be filtered or grouped as you would normally.
Narrative visual with Copilot
We’re excited to bring Copilot’s unique ability to summarize data to the rebranded Narrative with Copilot visual – formerly Smart Narratives. This visual allows you to use Copilot to summarize data across your report, or even specific pages or visuals you choose. We offer suggested prompts to get authors started, such as “Give me an executive summary,” “Answer likely questions from leadership,” and “Create a bulleted list of insights.” Users can also type in their own custom prompts and questions that return summaries about their data.
Users have the ability to choose whether they want to summarize the entire report, select pages, or even specific visuals across their report, giving them flexibility in what their summary looks like. Users will also see references for each portion of the summary that align to visuals on the various pages on the report, from which the summary lines were generated, making it easy to validate the summary’s accuracy and tie it back to the data.
The summary can be updated as the data is sliced and diced, so end users can interact with it, without editing the prompts themselves.
The narrative visual with Copilot makes it faster to communicate insights about the data that matters to you. The visual is available in the service and in Power BI Desktop.
We’re excited to share that datasets are being renamed to semantic models in Power BI and Fabric. This is necessary for disambiguation from other Fabric items and will make the product clearer and more usable. You will see this change in the most prominent UI elements in the product, and the documentation is being updated. APIs are not currently affected, and we will progressively roll out further changes. The timing of this change is driven by the general availability of Fabric and aligns with the rename for ML models. It reflects the immense progress that Power BI datasets have made in becoming an enterprise-grade semantic modeling technology. The semantic model’s name will help drive awareness of the unparalleled capabilities provided.
Power BI semantic models support for Direct Lake on Synapse Data Warehouse
We are delighted to announce that semantic models can now leverage Direct Lake mode in conjunction with Synapse Data Warehouses in Microsoft Fabric. Synapse Data Warehouse is the first transactional data warehouse to natively support the open data standard of Delta-Parquet in OneLake for seamless interoperability across all Fabric and Power BI workloads and the Spark ecosystem. By default, all tables and views in the Warehouse are automatically added to a default Power BI semantic model so that you can readily query the data with DAX and MDX in addition to T-SQL and Spark. For more details, see Default Power BI datasets in Microsoft Fabric in the product documentation. Of course, you can also build custom semantic models on top of a Synapse Data Warehouse.
Up until now, semantic models were only able to query Synapse Data Warehouses in DirectQuery mode. This has now changed! Default and custom semantic models can now operate in Direct Lake mode, as the below screenshot highlights. Direct Lake mode is a groundbreaking new data access technology for semantic models based on loading Delta-Parquet files directly from OneLake without having to import or duplicate the data. Especially when dealing with large data volumes, Direct Lake combines the advantages of DirectQuery and Import modes to deliver blazing-fast query performance without any data movement. That’s why we are so excited to introduce support for Direct Lake semantic models on top of Synapse Data Warehouses. For more information, see Learn about Direct Lake in Power BI and Microsoft Fabric in the product documentation.
With Direct Lake mode support, default semantic models and new custom semantic models on top of Synapse Data Warehouses will operate in Direct Lake mode out of the box. You don’t need to take any action. Existing custom semantic models, however, might have to be recreated or manually converted to Direct Lake mode using XMLA-based editing tools or SQL Server Management Studio as in the screenshot above. To verify that a semantic model is indeed leveraging Direct Lake mode for query processing, refer to the article Analyze query processing for Direct Lake datasets in the product documentation.
DAX query view to write and run DAX queries on your model
Quickly explore and analyze your semantic model with DAX queries. The DAX query view is a fourth view in Power BI Desktop which allows you to utilize the powerful DAX query language, using EVALUATE to discover, analyze, and see the data in your semantic model. Similar to the Explore feature for the Power BI service, model authors can quickly validate data and measures in their semantic model without having to build a visual, publish, or use an additional tool. Changes made to measures can be updated back to the semantic model. DAX queries are different from DAX expressions used to create model items such as measures, calculated columns, and calculated tables; they are more like SQL queries returning data in a table.
This powerful way to interact with your data model is now available in the new DAX query view . We give you several ways to be as productive as possible.
- Quick queries can generate DAX queries for you: the Data pane context menu for tables, columns, or measures gives you a head start by generating a DAX query to preview data or show summary statistics. Use DAX queries to help you understand the data without creating visuals, and for DirectQuery you no longer have to go back to Power Query to see some sample data.
- Quick queries can also get the DAX expression of all, some, or a specific measure in a generated DAX query. This provides the DEFINE block with the measure’s DAX expression and an EVALUATE block to see the measure’s output. You can then add any additional group-by columns to the DAX query.
- The Update model option is available for any DAX query updates to measures in the DAX query’s DEFINE block.
- Measures that reference other measures can now be seen on the same screen and updated together. Preview the output of your changes then also update the model when ready.
- Report view’s Performance Analyzer already lets you copy the visual DAX query. Now you no longer need to take that DAX query and use another tool to view and run it – simply run it in DAX query view.
- All these features come in a bigger DAX query editor similar to VS Code, including more keyboard shortcuts and the ability to improve readability by formatting any DAX query.
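To make the workflow above concrete, here is a sketch of the kind of query you might run in the DAX query view. The 'Sales' and 'Date' tables and the [Total Sales] measure are hypothetical placeholders — substitute fields from your own model:

```dax
// Hypothetical model: a 'Sales' table with a [Total Sales] measure and a 'Date' table.
DEFINE
    // A query-scoped measure; the Update model option can write such
    // DEFINE-block changes back to the semantic model.
    MEASURE Sales[Average Order] =
        DIVIDE ( [Total Sales], COUNTROWS ( Sales ) )
EVALUATE
    SUMMARIZECOLUMNS (
        'Date'[Year],
        "Total Sales", [Total Sales],
        "Average Order", [Average Order]
    )
ORDER BY 'Date'[Year]
```

The EVALUATE block returns a table of results, one row per year, which is exactly the quick data-validation loop the view is designed for.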
We plan to continue to add functionality to the DAX query view, so your feedback will be critical. We have previously shown a vision demo of a DAX query view copilot and saw the excitement for it; that copilot will be coming soon. We also have planned investments in bringing DAX query view to live connect reports and to the Power BI service, as well as building on it to give more insight into debugging and performance of DAX. Learn more about DAX queries at aka.ms/dax-queries and get started today by turning on this public preview feature in Options > Preview features.
Edit your data model in the Power BI Service – Updates
The new data model editing in the Service feature was released to preview in April. We’ve been busy reacting to your feedback and enhancing the experience. Below are the improvements coming this month:
Mark as date table
Within the Service, you can now mark a table in your data model as a date table. Marking a date table in your model allows you to use this table for various date-related elements including visuals, tables, quick measures, and more, with full Time Intelligence support. To set a date table in the Service, right-click on the desired table and choose ‘Mark as date table > Mark as date table’ in the menu that appears.
Next, specify the date column by selecting it from the dropdown menu within the ‘Mark as date table’ dialog. Power BI will then perform validations on the selected column and its data to ensure it adheres to the ‘date’ data type and contains only unique values.
Rename and delete tables and columns
Within the Service the following functionality is now supported:
- Renaming and deleting any table
- Renaming and deleting any column
Please continue to submit your feedback directly in the comments of this blog post or in our feedback forum .
Multiple or Empty selections
If the user makes multiple selections on the same calculation group, the current behavior is to return the same result as if the user had not made any selections. In this preview, this behavior changes: we return no results if you did not specify a multipleOrEmptySelectionExpression on the calculation group. If you did, we evaluate that expression and its related dynamic format string and return the result. You can, for example, use this to inform the user about what is being filtered:
-- Filter applied on MyCalcGroup:
'MyCalcGroup'[Name] = "item1" || 'MyCalcGroup'[Name] = "item2"

-- multipleOrEmptySelectionExpression on MyCalcGroup:
IF (
    ISFILTERED ( 'MyCalcGroup' ),
    "Filters: "
        & CONCATENATEX (
            FILTERS ( 'MyCalcGroup'[Name] ),
            'MyCalcGroup'[Name],
            ", "
        )
)
-- Returns "Filters: item1, item2"
In case of a conflicting or empty selection on a calculation group you might have seen this error before:
With our new behavior this error is a thing of the past and we will evaluate the multipleOrEmptySelectionExpression if present on the calculation group. If that expression is not defined, we will return no results.
One of the best showcases for this scenario is automatic currency conversion. Today, if you use calculation groups to do currency conversion, the report author and user must remember to select the right calculation group item for the currency conversion to happen. With this preview, you are now empowered to do automatic currency conversion using a default currency. On top of that, if the user wants to convert to another currency altogether, they can still do that, but even if they deselect all currencies the default currency conversion will still be applied.
Note how both the currency to convert to as well as the “conversion” calculation group item is selected.
Notice how the user only has to select the currency to convert to.
Read more about selection expressions in our calculation groups documentation.
The selection expressions for calculation groups are currently in preview. Please let us know what you think!
We are excited to announce the release of the new Azure Resource Graph connector! Please find below release notes from the Azure Resource Graph team.
Empower your data insights with our cutting-edge Power BI data connector for Azure Resource Graph! Now, seamlessly transform your Azure Resource Graph queries into stunning visualizations within Power BI. Elevate your analytics game and make data-driven decisions with ease. Unlock the synergy of Azure Resource Graph and Power BI today!
The Profisee connector has been updated. Below are updated release notes from the Profisee team.
Profisee’s Power BI Connector Version 3.0 exposes new information around data quality, enhancing analytics on data quality improvements for data managed in Profisee, with detailed information regarding validation issues flagged within the data. Additionally, the data types have been refined, streamlining the experience for users who load data from Profisee to Microsoft Fabric using Data Factory Gen 2 dataflows.
The Bloomberg Enterprise Data and Analytics connector has been updated. Below are updated release notes from the Bloomberg team.
This version of the Bloomberg Data and Analytics connector for Power BI includes changes to support underlying infrastructure updates and back-end performance enhancements. All user-facing features remain unchanged.
The Dremio connectors have been updated to support additional optional values and parameters.
The Celonis EMS connector has been updated with minor changes.
Within Power BI, many times users need to perform ad-hoc exploration of their data. This could be an analyst who just got access to a new dataset or data source and wants to spend time learning about the data before building a report off it. Or this could be a business user who needs to answer a specific question using the data to include in a PowerPoint presentation, but the report they’re using doesn’t answer the exact question they have. Creating a new report from scratch in these cases is a large hurdle, just to get a quick answer or screenshot for a deck.
Introducing the public preview of the new Explore feature, a lightweight and focused experience for exploring your data. Similar to exporting and building a PivotTable in Excel, users can now launch Explore directly within Power BI to begin creating a matrix/visual pair and get the answers they need without all the distractions and extra complexity of reports.
Simply find a dataset or report you’d like to explore:
Begin building your matrix/visual pair to get to the answers you need:
And, if you’d like to return to your work, save it as an exploration:
Find more details in the Introducing Explore (Public Preview) blog post.
Copilot for Power BI in Microsoft Fabric
We are thrilled to announce the public preview of Copilot in Microsoft Fabric, including the experience for Power BI, which helps users quickly get started by creating reports in the Power BI web experience. We’ve also added Copilot’s unique ability to summarize data to the Smart Narrative visual, now rebranded as the Narrative with Copilot visual. The visual is available in the Power BI service and in Power BI Desktop. Lastly, in Desktop we’ve added the ability to generate synonyms for fields, measures, and tables using Copilot. To use Copilot, you’ll need access to a workspace that has a P1 or higher or an F64 or higher capacity.
Head over to our Ignite blog Empower Power BI Users with Microsoft Fabric and Copilot to read all the announcements related to Copilot. We’ll share more details in a dedicated blog next week.
Check out the Copilot for Power BI Docs for complete instructions and requirements and don’t hesitate to leave a comment in the Fabric Community site if you have any questions.
OneLake integration for Import-mode semantic models
We are absolutely thrilled to introduce yet another groundbreaking semantic model technology to the world! We are announcing the public preview of Microsoft OneLake integration for import models. With the click of a mouse button, you can enable OneLake integration and automatically write data imported into your semantic models to delta tables in OneLake, as depicted in the following diagram. The data is instantaneously and concurrently accessible through these delta tables. Data scientists, DBAs, app developers, data engineers, citizen developers and any other type of data consumer can now get seamless access to the same data that drives your business intelligence and financial reports. You can include these delta tables in your Lakehouses and Synapse Data Warehouses via shortcuts so that your users can use T-SQL, Python, Scala, PySpark, Spark SQL, R, and no-code/low-code solutions to query the data.
OneLake integration can even help you if you don’t plan to query the data. Perhaps you only want to export the data to backup files. Thanks to OneLake integration, this is very straightforward now. Ensure that your import-mode semantic model is hosted in a workspace on a Premium or Fabric capacity and that the large dataset storage format is enabled. Then, enable OneLake integration and perform a manual or scheduled data refresh operation. That’s it! The semantic model writes the imported data to the delta tables as part of the refresh operation. Exporting import-mode tables has never been easier. The delta tables are kept up to date without requiring any ETL pipelines copying data.
Of course, you can also export the data programmatically via Tabular Object Model (TOM) and Tabular Model Scripting Language (TMSL) if you can access your semantic model through XMLA in read-write mode. For example, you can open SQL Server Management Studio (SSMS) and run the following TMSL command (see also the screenshot below):
{
  "export": {
    "layout": "delta",
    "type": "full",
    "objects": [
      {
        "database": "<Name of your database>"
      }
    ]
  }
}
If you have installed the latest version of OneLake File Explorer, you can conveniently verify the success of the export process by using Windows File Explorer. In OneLake File Explorer, right-click on the workspace folder and select Sync from OneLake. Then, in the workspace folder, look for a subfolder with a name that matches your semantic model and that ends with .SemanticModel, as in the screenshot above. In this semantic model folder, every import-mode table has a subfolder that contains the delta table’s parquet files and delta log.
But you don’t need to know these file system details if you add shortcuts for your semantic model’s delta tables to other workloads in Fabric, such as lakehouses etc. Simply launch the Shortcut Wizard UI, pick Microsoft OneLake, select the semantic model, and then pick the tables you want to include, as in the screenshots below, and that’s it. You are ready to read and query the tables using your favorite data tools and APIs.
And there you have it! Now you can use Direct Lake mode to read delta tables directly from OneLake and write delta tables thanks to OneLake integration. Fabric is redefining how customers can build their BI solutions for faster performance at big-data scale while at the same time reducing Total Cost of Ownership (TCO) and infrastructure complexity. For example, you no longer need an entire portfolio of homegrown ETL solutions to get data volumes of any size in and out of semantic models. So, don’t delay and see for yourself how OneLake integration can help you maximize the return on your investments in semantic models by making the data instantaneously and concurrently accessible to data scientists, DBAs, app developers, data engineers, citizen developers, and any other type of data consumer in your organization through delta tables added to your Lakehouses and Synapse Data Warehouses via shortcuts. And as always, provide us with feedback if you want to help deliver additional enhancements. We hope you are as excited about OneLake integration as we are. We think this is a massive innovation and are looking forward to hearing from you!
RLS/OLS security and stored credentials for Direct Lake semantic models
We are thrilled to announce the public preview of RLS/OLS security and stored credentials for Direct Lake semantic models. RLS/OLS security is a Power BI feature that enables you to define row-level and object-level access rules in a semantic model, so that different users can see different subsets of the data based on their roles and permissions. Stored credentials help reduce configuration complexity and are strongly recommended when using RLS/OLS with Direct Lake semantic models. The following screenshot shows how you can add users to RLS roles in a Direct Lake model by using the Web modeling experience. The web modeling security roles dialog will be fully deployed in the coming days or weeks. For more information about how to set up stored credentials, see the Direct Lake product documentation. For RLS and OLS, see the articles Row-level security (RLS) with Power BI and Object level security (OLS) .
There is (almost) nothing special for RLS/OLS in Direct Lake models. You can define roles and assign users as for any other semantic model type. But keep in mind that by default Direct Lake models use single sign-on (SSO) authentication to the underlying data source. Using RLS/OLS in conjunction with SSO can be challenging because it involves multiple authorization layers—RLS/OLS in the semantic model and user authorization at the data source. For example, if you wanted to authorize a new user, you would have to add that new user to appropriate RLS roles and ensure that the user has access permissions to the underlying delta tables in the lakehouse or data warehouse.
Managing user authorization at multiple layers adds complexity and friction. That’s why we are excited to introduce support for stored credentials with Direct Lake semantic models. Your semantic models can now access the delta tables at the source with a single, fixed identity on behalf of the users instead of delegating the actual user identities via SSO. When adding new users to an RLS role, you are effectively authorizing them to use the fixed identity. Because this approach avoids SSO-related complexity and friction, we strongly recommend that you switch to a fixed identity whenever you add RLS/OLS to a Direct Lake model. Switching to a fixed identity is as easy as binding the Direct Lake model to a Shareable Cloud Connection (SCC) that has SSO disabled. For more information, see Connect to cloud data sources in the Power BI service in the product documentation.
Here are the steps to configure a Direct Lake model with a fixed identity:
- Display the settings of the Direct Lake model and expand the Gateway and cloud connections section. Note that your Direct Lake model has a SQL Server data source pointing to a lakehouse or data warehouse in Fabric.
- Under Maps to, open the listbox, and click on Create a connection. This takes you to the connections management page with the new connection form opened and prepopulated with the data source information.
- Select OAuth 2.0 or Service Principal as the authentication method and provide the credentials of the fixed identity you want to use.
- Make sure you clear the checkbox labeled Use SSO via Azure AD for DirectQuery queries, as in the following screenshot.
- Configure any other parameters as needed and then click Create. This takes you back to the Direct Lake model settings page. Verify that the data source is now associated with the non-SSO cloud connection.
The ability to set up stored credentials is available today! The RLS editor for Direct Lake datasets in the web modeling experience is being deployed and will be visible in the coming days or weeks.
And that’s it for this announcement of RLS/OLS with fixed identities for Direct Lake semantic models. For more information, see the articles about Direct Lake semantic models in the product documentation. We hope that these exciting new capabilities enable you to create and migrate even more Power BI semantic models to Direct Lake mode so that you can take full advantage of all the data movement, data science, real-time analytics, Office integration, AI, and BI capabilities that Fabric and Power BI have to offer. And please provide us with feedback if you want to help shape the future of the world’s best and most successful BI service – Power BI on the unified Fabric platform! We always love to hear from you!
Learn about Direct Lake in Power BI and Microsoft Fabric – Power BI | Microsoft Learn
Shareable cloud connections for semantic models
Along with the general availability (GA) of shareable cloud connections (SCC), we are happy to announce that SCC support for semantic models and paginated reports is GA as well. Now, you can use this modern connection type in conjunction with your production semantic models and paginated reports to access cloud data sources and centralize cloud connection management. In enterprise organizations, centralizing cloud connection management in addition to data gateway management can help to lower the overhead of maintaining data connections and credentials. SCCs let you securely share access to cloud data sources through an access-control list. The credentials are protected and cannot be retrieved from the SCCs, but Power BI users with at least Use permissions can connect their semantic models and paginated reports to the cloud data sources through these SCCs. You can also create multiple connections to the same data source, which is particularly useful if you want to use different connection settings, such as different credentials, privacy settings, or single-sign-on settings, for different semantic models, paginated reports, and other artifacts.
Semantic model scale-out
We are thrilled to announce semantic model scale-out is now generally available (GA). Large-scale production solutions will benefit from high user concurrency, as Power BI automatically scales out read-only replicas to ensure performance doesn’t slow down when lots of users are using the system at the same time. And of course, automatic scale-out works for Direct Lake semantic models! Additionally, Import-mode semantic models will benefit from refresh isolation, ensuring business users are unaffected by resource-intensive refresh operations and continue to enjoy blazing-fast queries for interactive analysis.
Here’s a quick summary of the benefits semantic model scale-out can provide to your reports, dashboards, and other BI solutions:
- Increased query throughput: Power BI can automatically scale read-only replicas when query volume increases and fluctuates.
- Refresh isolation: Refresh and write operations on the read-write replica do not impact the query performance on read-only replicas.
- More flexibility for advanced data refresh scenarios: As a side benefit of refresh isolation, you can now perform advanced refresh operations on the read-write replica without impacting the read-only replicas. Simply disable automatic replica synchronization, then refresh, refresh, refresh until the read-write replica is fully updated, and then synchronize the read replicas manually.
Semantic model scale-out is the last of the key features needed to make Microsoft Fabric and Power BI a superset of Azure Analysis Services (AAS), and it is superior in Fabric compared to its equivalent in AAS. Unlike AAS, scale-out takes place based on live user demand and adjusts automatically to changes in usage patterns. AAS, on the other hand, requires detailed analysis to determine peak usage times, creation of automation scripts, and ongoing monitoring to ensure an optimal setup. Additionally, cost in AAS increases linearly per replica, unlike Fabric, where cost is usage based.
Please refer to the Configure dataset scale-out article in the product documentation for details on how to enable semantic model scale-out.
Show visuals as tables
Leveraging our previous accessibility improvements to table and matrix, we are now introducing a new view mode called Show visuals as tables, which displays report visuals in a tabular format with a single action. Some users may prefer to consume data in a text-based or tabular format depending on their learning styles and use of assistive technologies. This provides a supplemental format for visuals that allows users to display the data in the way that best meets their needs.
This new view mode is similar to how Show as a table displays underlying data for individual visuals today. Show visuals as tables will display the underlying data for visuals for all pages in the current report, with the added functionality of interaction and cross-filtering capabilities.
To activate this view mode, navigate to the view dropdown menu and select Show visuals as tables.
To revert, select Show original visuals.
Or simply use the keyboard shortcut Control + Shift + F11 to toggle between the two views.
Learn more details about this feature, including limitations, in our documentation: Consuming reports in Power BI with accessibility tools
- Performance Flow – xViz
- Galigeo For Power BI
- Calendar by Datanau
- Image Pro by CloudScope
- Sparkline by OKViz
- Apex Gantt Chart
With Zebra BI Tables 6.6 , you can add multiple text columns inside the table. Using this feature, you can bridge the gap in data visualization by having relevant information side by side in the world’s best table/matrix custom visual for Power BI.
Watch the video of the new functionality !
With this feature, you can now display multiple text columns in a tabular way, which leads to a better understanding of the data when displaying more attributes of the same category. Additionally, there is no need to apply any complex DAX functions. Simply add additional text columns into the ‘Values’ placeholder.
SOME POPULAR USE CASES:
In one-to-one mapping, you usually need to add additional information next to the descriptive column to ensure data accuracy, consistency, and ease of reference.
- Product ID: Product name
- Product ID: SKU: Shipping/Order ID
- Job ID: Job title
The new feature also works in cases with one-to-many mapping. For example,
- Customer: Sales representatives because the same person can represent multiple customers.
You can also add multiple text columns when presenting data in a hierarchical table or using cross-tables for quarterly performance.
Try it on your data today for free.
The Funnel Chart by Powerviz is a stunning and informative visualization. It has 4 chart types in 1 visual, including a pyramid chart. The Power BI-certified visual is useful for tracking progress through different stages. It can also group data with legends for detailed insights.
- Funnel Settings : Multiple display types are available, including vertical and horizontal orientation.
- Data Colors : Offers 7 schemes and 30+ color palettes.
- Labels : Select from multiple display styles with a custom label option included.
- Conversion Rate : In a single click, measure the percentage of top and bottom stages to identify bottlenecks.
- Fill Patterns : Highlight stages with custom or pre-filled patterns.
- Conditional Formatting : Create rules based on measures or categories.
Many other features included are ranking, annotation, grid view, show condition, and accessibility support.
Business Use Cases:
- Sales Funnel Analysis : Track sales stages.
- Marketing Campaigns : Assess lead generation and conversion rates.
- User Onboarding : Monitor steps to product adoption.
- Website Traffic : Visualize visitor drop-offs.
Try Funnel Chart for FREE from AppSource.
Check out the visual features in the demo file.
Step-by-step instructions and documentation.
To learn more, visit the Powerviz website.
Funnel Chart by Powerviz Feature Video on YouTube.
Drill Down Timeline PRO lets you easily create timelines with a date/time hierarchy. Click directly on the chart to drill down to examine specific periods in detail. Combine multiple series and choose between multiple chart types (line, column, area). Learn more.
Main features include:
- On-chart interactions – click on chart to drill down to months, days, hours, or milliseconds
- Customize up to 25 series
- DAX measure support
- Take customization to the next level with conditional formatting
- Use series defaults and value labels defaults to customize multiple series
- Static and dynamic thresholds – set up to 4 thresholds to demonstrate targets
- Full customization – colors, gradients, backgrounds, fonts, and more
- Touch device friendly – explore data on any device
Popular use cases:
- Banking & Finance – stock exchange indices, capital ratios, transaction volumes
- Sales & Marketing – web traffic, audience reach, sales revenue
- Information Technologies – network traffic, response times, syslog and error trends
- Manufacturing – quality metrics, uptime and downtime, production output and cost
Get Drill Down Timeline PRO from AppSource!
ZoomCharts Drill Down Visuals are known for their interactive drilldowns, smooth animations, rich customization options and support: interactions, selections, custom and native tooltips, filtering, bookmarks, and context menu.
Enhanced accessibility in paginated reports authored in Report Builder
The StructureTypeOverwrite property has been added to the .rdl model. You can use it to improve accessibility for paginated reports in Microsoft Report Builder and Power BI Report Builder. You can then publish these reports to the Power BI service. Read more about improving accessibility of paginated reports.
We are excited to announce the Microsoft Ignite: Microsoft Fabric Challenge as part of the Microsoft Learn Cloud Skills Challenge . Skill up for in-demand tech scenarios and enter to win a VIP pass to the next Microsoft Ignite. The challenge is on until January 15, 2024.
In this challenge, you will learn how to connect to data, ingest it with Data Factory and notebooks, store it in the lakehouse or data warehouse, and create Power BI reports to turn your data into competitive advantage.
The challenge will help you prepare for the Microsoft Certified: Fabric Analytics Engineer Associate certification and new Microsoft Applied Skills credentials covering the lakehouse and data warehouse scenarios, which are coming in the next months.
Use dynamic binding to maintain a single template of an RDL report that can be connected to multiple datasets across workspaces, instead of copying and maintaining hundreds of report duplicates. You can bind datasets dynamically to a paginated report as outlined in the “Bind datasets dynamically to a paginated report” documentation.
You can also bind datasets dynamically to a paginated report visual as outlined in the “Bind datasets dynamically to a paginated report visual” documentation.
Query Insights (QI) is a scalable, sustainable, and extendable solution to enhance the SQL analytics experience. With historical query data, aggregated insights, and access to actual query text, you can analyze and tune your SQL queries.
Query Insights provides a central location for historic query data and actionable insights for 30 days, helping you to make informed decisions to enhance the performance of your Warehouse or SQL Endpoint. When a SQL query runs in Microsoft Fabric, Query Insights collects and consolidates its execution data asynchronously, providing you with valuable information. Admin, Member, and Contributor roles can access the feature.
Query insights provide the following:
Historical Query Data : Query Insights stores historical data about query executions, enabling you to track performance changes over time. System queries aren’t stored in Query Insights.
Aggregated Insights : Query Insights aggregates query execution data into insights that are more actionable, such as identifying long-running queries or most frequent queries.
There are three system views that provide answers to your key query performance analysis and tuning-related questions:
- queryinsights.exec_requests_history (Transact-SQL)
Returns information about each completed SQL request/query.
- queryinsights.long_running_queries (Transact-SQL)
Returns information about long-running queries, ordered by query execution time.
- queryinsights.frequently_run_queries (Transact-SQL)
Returns information about frequently run queries.
Autogenerated views are available under the queryinsights schema in SQL Endpoint and Warehouse.
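As a quick sketch of how these views can be used (the column names below follow the Query Insights documentation; verify them against your endpoint before relying on this):

```sql
-- Ten slowest completed queries over the last 7 days
-- (column names per the queryinsights documentation; verify on your endpoint)
SELECT TOP 10
    distributed_statement_id,
    start_time,
    total_elapsed_time_ms,
    command
FROM queryinsights.exec_requests_history
WHERE start_time >= DATEADD(day, -7, GETUTCDATE())
ORDER BY total_elapsed_time_ms DESC;
```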
Read more about Query Insights here: Query Insights – Microsoft Fabric | Microsoft Learn
We are excited to announce that the Data Warehouse now publishes all Inserts, Updates, and Deletes for each table to its Delta Lake Log in OneLake!
Our vision is to break down data silos and make it easy to share data from your Data Warehouses with other teams who use different services without having to create copies of your data in different formats.
What does this mean?
Today, team members have a wide set of skills and varying comfort levels with different tools and query languages such as Python, T-SQL, KQL, and DAX. Instead of requiring copies of your data in different formats for each tool and service, Fabric leverages Delta Lake as a common format across all its services. Having only one copy of your data makes it more secure and easier to manage, ensures the data is consistent across reports, and makes it faster and easier to share.
The Data Warehouse supports this by publishing Delta Lake Logs for every table that you create in your Data Warehouses. When you modify data within a Data Warehouse table, those changes will be visible in the Delta Lake Log within 1 minute of the transaction being committed.
For example, say you want to use Python to query a Data Warehouse table by using a Notebook in a Lakehouse. All you would need to do is to create a new shortcut in the Lakehouse and point it to the Data Warehouse Table. That table is now directly accessible by your Notebook and no data has been copied or duplicated! Data Scientists and Data Engineers are going to love how easy it is to incorporate your Data Warehouse Tables into their projects like Machine Learning and training AI models.
To learn more about how to create shortcuts that point to Data Warehouse Tables, please see this documentation article: Create a OneLake shortcut – Microsoft Fabric | Microsoft Learn.
You might wonder, how do I enable this? The answer is that you do not have to do anything! This all happens automatically with your Data Warehouses.
Note, only tables created going forward will have all DML published. If you have an older table that you wish to be fully published, you will need to use CTAS (Create Table as Select) to create a new copy of the table with all its data or drop the table and reload it.
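For instance, assuming a hypothetical older table named dbo.sales, a CTAS like the following re-creates it so that the new table’s changes are published going forward:

```sql
-- Re-create an existing table via CTAS so the new copy publishes its
-- Delta Lake log (dbo.sales is an illustrative table name)
CREATE TABLE dbo.sales_published
AS
SELECT * FROM dbo.sales;
-- After verifying the copy, drop the old table and rename the new one.
```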
To learn more about how to leverage your Data Warehouse’s data through its published Delta Lake Logs, please see our documentation Delta Lake logs in Warehouse – Microsoft Fabric | Microsoft Learn.
We are excited to announce automatic data compaction for Data Warehouses!
One of our goals with the Data Warehouse is to automate as much as possible to make it easier and cheaper for you to build and use them. This means you will be spending your time on adding and gaining insights from your data instead of spending it on tasks like maintenance. As a user, you should also expect great performance, which is where Data Compaction comes in!
Why is Data Compaction important?
To understand what Data Compaction is and how it helps, we need to first talk about how Data Warehouse Tables are physically stored in OneLake.
When you create a table, it is physically stored as one or more Parquet files. Parquet files are immutable, which means that they cannot be changed after they are created. When you perform DML (Data Manipulation Language) operations, such as Inserts and Updates, each transaction creates new Parquet files. Over time, you could have thousands of small files. When reading parquet files, it is faster to read a few larger files than it is to read many small files.
Another reason for Data Compaction is to remove deleted rows from the files. When you delete a row, the row is not actually deleted right away. Instead, we use a Delta Lake feature called Delete Vectors which are read as part of the table and let us know which rows to ignore. Delete Vectors make it faster to perform Deletes and Updates because we do not need to re-write the existing parquet files. However, if we have many deleted rows in a parquet file, then it takes more resources to read that file and know which rows to ignore.
How does Data Compaction happen?
As you run queries in your Data Warehouse, the engine will generate system tasks to review tables that potentially could benefit from data compaction. Behind the scenes, we then evaluate those tables to see if they would indeed benefit from being compacted.
The compaction itself is actually very simple! It is basically just re-writing either the whole table or portions of it to create new parquet files that do not have any deleted rows and/or have more rows per file.
Data Compaction is one of the ways that we help your Data Warehouse to provide you with great performance and best of all, it involves no additional work from you! This helps give you more time to work on leveraging your Data Warehouse to gain more value and insights!
Please look forward to more announcements about more automated performance enhancements!
We are excited to announce that the Data Warehouse now supports sp_rename.
With sp_rename, you can rename user objects like Tables, Stored Procedures, Functions etc.
Here is an example of how to use sp_rename that fixes a spelling mistake for city:
EXEC sp_rename 'dbo.dimension_ctiy', 'dimension_city';
When you rename an object, change the schema an object belongs to, or drop a table, the changes will be reflected in OneLake within 1 minute.
For more information, please see our documentation:
sp_rename (Transact-SQL) – SQL Server | Microsoft Learn
We’re excited to announce a new, faster way to ingest data from CSV files into Fabric Warehouse: introducing CSV file parser version 2.0 for COPY INTO. The new CSV file parser builds on an innovation from Microsoft Research’s Data Platform and Analytics group to make CSV file ingestion blazing fast on Fabric Warehouse.
The performance benefits you will enjoy with the new CSV file parser vary depending on the number of files you have in the source, the size of these files, and the data layout. Our testing revealed an overall improvement of 38% in ingestion times on a diverse set of scenarios, and in some cases, more than 4 times faster when compared to the legacy CSV parser.
The new CSV file parser is now available and is the new default file parser for CSV files during ingestion, so you do not need to do anything to enjoy its benefits. For more details, refer to our documentation on https://learn.microsoft.com/sql/t-sql/statements/copy-into-transact-sql?view=fabric&#parser_version—10–20- .
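As a sketch, the parser version can also be pinned explicitly in the COPY INTO statement (the storage URL and table name here are illustrative):

```sql
COPY INTO dbo.sales_staging
FROM 'https://contoso.blob.core.windows.net/data/sales/*.csv'
WITH (
    FILE_TYPE = 'CSV',
    PARSER_VERSION = '2.0',  -- the new default; '1.0' selects the legacy parser
    FIRSTROW = 2             -- skip the header row
);
```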
We are excited to announce that the Fabric Warehouse now enables you to execute queries that return huge result sets. This capability is useful for scenarios where you need to export data into other systems, refresh large caches, or import substantial amounts of data from a Warehouse into Power BI datasets or other systems.
You can handle multi-terabyte results per query without impacting your workload. Warehouse leverages OneLake storage to store and buffer large temporary results before delivering them to the client apps. Smaller results are directly returned to the clients, while larger results are temporarily offloaded into lake storage and then streamed to the client. This feature allows you to work with exceptionally large data warehouses without worrying about the effect of large result sets on your queries and potential size limits on your results.
All query executions in Fabric Warehouse are now powered by the new technology recently deployed as part of the Global Resource Governance component, which assigns compute resources in milliseconds! Warehouse workloads are dynamic and can change unpredictably, leading to spikes and dips in the number of resources needed for optimal execution. To meet this demand in real time, Global Resource Governance keeps track of all the compute resources in the region and keeps them in a ready state. This enables assignment in milliseconds, providing a seamless scale-up experience and allowing workloads to burst.
That is not all: Global Resource Governance also improves reliability, efficiency, and performance, and you can read all about these benefits in a separate blog post.
Previously, Fabric Warehouse utilized in-memory and SSD cache to store frequently accessed data on local disks in a highly optimized format. This significantly reduced IO latency and expedited query processing. As a result of this enhancement, file and rowgroup metadata are now also cached, further improving performance.
You can now remove spaces or specific characters from strings by using the keywords LEADING, TRAILING, or BOTH in the TRIM SQL command.
TRIM ([ LEADING | TRAILING | BOTH] [characters FROM] string)
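For example:

```sql
SELECT TRIM(LEADING '0' FROM '000123');     -- returns '123'
SELECT TRIM(TRAILING '.' FROM 'report...'); -- returns 'report'
SELECT TRIM(BOTH '*' FROM '**flagged**');   -- returns 'flagged'
```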
Generate a series of numbers within a given interval with the GENERATE_SERIES SQL command. The interval and the step between series values are defined by the user.
GENERATE_SERIES (start, stop [, step])
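For example, the following returns the even numbers from 0 through 10 in a column named value:

```sql
SELECT value FROM GENERATE_SERIES(0, 10, 2);
-- returns 0, 2, 4, 6, 8, 10
```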
On July 5th, we announced the ability to clone data warehouse tables within Microsoft Fabric as of the current point in time. The ability to clone tables is a powerful technique that not only empowers businesses to streamline reporting and analytics but also helps expedite development and testing processes.
While data warehouses constantly evolve, it is often necessary to capture a snapshot of data as it existed at a particular moment in time. We are now excited to introduce the ability to clone tables with time travel, up to a default data history retention period of seven calendar days. Table clones can be created within and across schemas in the data warehouse within Microsoft Fabric.
Businesses can now unlock the ability to perform historical trend analysis, enabling them to compare data from various historical points. It empowers them to identify trends and facilitates making well-informed, data-driven decisions.
Cloning a table at previous time points offers the advantage of preserving historical data records, serving a valuable role in meeting various audit and compliance requirements. When data discrepancies occur, these clones not only assist in generating older table versions for root cause analysis but also help create older versions of the table for seamless business continuity.
Get started with table clones by creating your first clone either through T-SQL or through the UX in the Microsoft Fabric portal.
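For example, a hedged T-SQL sketch (table names are hypothetical; verify the AT clause syntax against the CREATE TABLE AS CLONE OF documentation):

```sql
-- Zero-copy clone of a table as of the current point in time
CREATE TABLE dbo.DimCustomer_Clone AS CLONE OF dbo.DimCustomer;

-- Clone as of a past point in time (within the retention period)
CREATE TABLE dbo.DimCustomer_Backup AS CLONE OF dbo.DimCustomer
    AT '2023-11-01T10:00:00.000';
```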
We’re excited to announce the launch of RESTful public APIs for Warehouse! With the warehouse public APIs, SQL developers can now automate their pipelines and establish CI/CD conveniently and efficiently. The warehouse REST public APIs make it easy for users to manage and manipulate Fabric Warehouse items. Here are the supported Warehouse REST APIs:
- Create Item
- Delete Item
- Get Item Definition
- Update Item
- Update Item Definition
You can learn about detailed usage by following the Fabric REST APIs public documentation.
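As a rough sketch, a Create Item call might be composed as follows (the endpoint path, item type name, and payload shape are assumptions based on Fabric REST API conventions; consult the public documentation before use):

```python
import json

# Hypothetical workspace ID; replace with your own.
workspace_id = "aaaabbbb-0000-1111-2222-ccccdddd0000"

# Fabric items API convention: POST to the workspace's items collection.
url = f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/items"

# Minimal payload to create a Warehouse item.
payload = {
    "displayName": "SalesWarehouse",
    "type": "Warehouse",
}

# An HTTP client (with an AAD bearer token) would then POST this body:
#   requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=payload)
print(url)
print(json.dumps(payload))
```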
We are excited to announce SQLPackage support for Fabric Warehouses! SqlPackage is a command-line utility that automates the following database development tasks by exposing some of the public Data-Tier Application Framework (DacFx) APIs:
- Version: Returns the build number of the SqlPackage application. Added in version 18.6.
- Extract: Creates a data-tier application (.dacpac) file containing the schema or schema and user data from a connected SQL database.
- Publish: Incrementally updates a database schema to match the schema of a source .dacpac file. If the database does not exist on the server, the publishing operation creates it. Otherwise, an existing database will be updated.
- DeployReport: Creates an XML report of the changes that would be made by a publish action.
- DriftReport: Creates an XML report of the changes that have been made to a registered database since it was last registered.
- Script: Creates a Transact-SQL incremental update script that updates the schema of a target to match the schema of a source.
The SqlPackage command line tool allows you to specify these actions along with action-specific parameters and properties.
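For instance, an Extract followed by a Publish against a Fabric warehouse might look like the following command sketch (server, database, and file names are placeholders; check the SqlPackage documentation for the connection and authentication options your environment needs):

```shell
# Extract the schema of a warehouse into a .dacpac file
sqlpackage /Action:Extract \
    /SourceServerName:"<your-endpoint>.datawarehouse.fabric.microsoft.com" \
    /SourceDatabaseName:"SalesWarehouse" \
    /TargetFile:"SalesWarehouse.dacpac"

# Publish the .dacpac to another warehouse, creating or updating it
sqlpackage /Action:Publish \
    /SourceFile:"SalesWarehouse.dacpac" \
    /TargetServerName:"<your-endpoint>.datawarehouse.fabric.microsoft.com" \
    /TargetDatabaseName:"SalesWarehouse_Test"
```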
We are excited to announce the following user experience features to increase your productivity and provide a seamless experience within the Warehouse and Lakehouse SQL analytics endpoint:
- Clone table user experience – Easily create zero-copy clones of tables by selecting the table’s current or a past point-in-time state.
- Save as view/table in visual query editor – A no-code experience to save your query as a view or save results into a table via the visual query editor, enabling you to save your analysis for future use.
- Endorsement – Promote or certify your Warehouse or SQL analytics endpoint of Lakehouse to make it discoverable within your organization.
- Sample loading experience – Improved sample loading experience with performance improvements and visibility into steps to load sample data into your Warehouse.
- Viewers/shared recipients can save queries – We automatically save your queries for all users of the Warehouse, including viewers and shared recipients.
Dynamic Data Masking (DDM) is a powerful security feature that enables organizations to protect sensitive data while preserving the functionality of their applications. DDM allows you to define masking rules for specific columns in your database, ensuring that sensitive information is never exposed in its raw form to unauthorized users or applications.
With DDM, you can maintain data privacy and confidentiality without altering your core data structure or application logic.
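As an illustrative sketch, masking rules might be defined like this (table and column names are hypothetical; the masking functions follow the standard T-SQL DDM syntax):

```sql
-- Mask the email column for users without UNMASK permission
ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');

-- Partial mask: expose only the last 4 digits of the phone number
ALTER TABLE dbo.Customers
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');
```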
We are thrilled to announce a significant step toward enhancing accessibility in our Lakehouse experience to provide a more inclusive and user-friendly interaction.
Here are the key initiatives and improvements we have implemented so far to support accessibility:
- Screen Reader Compatibility: Work seamlessly with popular screen readers, enabling visually impaired users to navigate and interact with our platform effectively.
- Text Reflow: Responsive design that adapts to different screen sizes and orientations. Text and content reflow dynamically, making it easier for users to view and interact with our application on a variety of devices.
- Keyboard Navigation: Improved keyboard navigation to allow users to move through Lakehouse without relying on a mouse, enhancing the experience for those with motor disabilities.
- Alternative Text for Images: All images now include descriptive alt text, making it possible for screen readers to convey meaningful information.
- Form Fields and Labels: All form fields have associated labels, simplifying data input for everyone, including those using screen readers.
We encourage all our users to share their thoughts and suggestions for further improvements. We’ll continue to monitor feedback from users and make ongoing improvements to maintain the highest standards of inclusivity.
We’ve introduced new capabilities to enhance the multi-tasking experience in Lakehouse. Our goal is to make your data management journey as efficient and user-friendly as possible. This latest enhancement includes the following changes designed to supercharge your productivity and streamline your daily tasks:
- Preserve Running Operations: Have an upload or data loading operation running in one tab and need to check on another task? No problem. With our enhanced multi-tasking, your running operation will not be canceled when you navigate between tabs. Focus on your work without interruptions.
- Retain Your Context: Selected objects, data tables, or files remain open and readily available when you switch between tabs. The context of your data Lakehouse is always at your fingertips.
- Non-Blocking List Reload: We’ve introduced a non-blocking reload mechanism for your files and tables list. You can keep working while the list refreshes in the background, ensuring that you work with the most up-to-date data without any interruption.
- Clearly Defined Notifications: Toast notifications will now specify which Lakehouse they are coming from, making it easier to track changes and updates in your multi-tasking environment.
We understand that managing a Lakehouse can involve numerous complex tasks, and this upgrade is designed to help you get more done in less time. Stay productive, stay efficient, and happy multi-tasking!
We are excited to introduce an upgraded DataGrid for Lakehouse table preview experience with advanced features designed to make working with your data even more seamless and powerful.
Here’s what you can look forward to in this upgrade:
- Sorting Data: Whether you’re working with large datasets or need to quickly identify trends, this feature will be a game-changer. Sort columns in ascending or descending order with a simple click, giving you full control over your data’s organization.
- Keyword Search: Type in a keyword, and the DataGrid will instantly display matching results, helping you narrow down your search.
- Value Filtering: Easily filter data by selecting from a list of available values. It’s a fast and efficient way to locate the exact data you’re looking for.
- Resizing Columns: Whether you want to prioritize certain data or view a wide range of fields, the ability to resize columns gives you full flexibility.
We are dedicated to providing you with the best tools and features to simplify your data management tasks. Stay tuned for more exciting updates in the future!
We understand that one of the core features of the Lakehouse you rely on for a successful end-to-end experience is the SQL endpoint, and that it’s crucial that it functions seamlessly to support your day-to-day needs.
We have been listening to your feedback, and today, we are delighted to introduce a significant improvement that empowers you to self-mitigate issues related to SQL endpoint provisioning. Our goal is to provide you with the tools to address any potential hiccups in a user-friendly manner, reducing any inconvenience you might encounter.
We now offer you the ability to retry SQL endpoint provisioning directly within the Lakehouse experience. This means that if your initial provisioning attempt fails, you have the option to try again without the need to create an entirely new Lakehouse.
We hope that this new feature will provide you with a more seamless and reliable experience for your data management needs.
We are thrilled to introduce Microsoft Fabric Runtime 1.2, representing a significant advancement in our data processing capabilities. Microsoft Fabric Runtime 1.2 includes Apache Spark 3.4.1, Mariner 2.0 as the operating system, Java 11, Scala 2.12.17, Python 3.10, Delta Lake 2.4, and R 4.2.2, ensuring you have the most cutting-edge tools at your disposal. In addition, this release comes bundled with default packages, encompassing a complete Anaconda installation and essential libraries for Java/Scala, Python, and R, simplifying your workflow.
With the introduction of Runtime 1.2, Fabric now supports multiple runtimes, offering users the flexibility to seamlessly switch between them and minimizing the risk of incompatibilities or disruptions.
To change the runtime version at the workspace level, go to Workspace Settings > Data Engineering/Science > Spark Compute > Workspace Level Default, and select your desired runtime from the available options.
Once you make this change, all system-created items within the workspace, including Lakehouses, Spark Job Definitions (SJDs), and Notebooks, will operate using the newly selected workspace-level runtime version starting from the next Spark session. If you are currently using a notebook with an existing session for a job or any lakehouse-related activity, that Spark session will continue as is. However, starting from the next session or job, the selected runtime version will be applied.
What changed? The default Spark session parameter spark.sql.sources.default is now ‘delta’.
All tables created using Spark SQL, PySpark, Scala Spark, and Spark R will be created as Delta tables by default whenever the table type is omitted. If scripts explicitly set the table format, that will be respected. The ‘USING DELTA’ clause in Spark create table commands becomes redundant.
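To illustrate the new default, the following Spark SQL sketch shows equivalent create-table statements (table names are hypothetical):

```sql
-- Creates a Delta table: USING DELTA is now implied
CREATE TABLE sales_orders (id INT, amount DOUBLE);

-- Equivalent to the above; the explicit clause is now redundant
CREATE TABLE sales_orders_v2 (id INT, amount DOUBLE) USING DELTA;

-- Explicit non-Delta formats are still respected
CREATE TABLE sales_raw (id INT, amount DOUBLE) USING PARQUET;
```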
Scripts that expect or assume parquet table format should be revised. The following commands are not supported in Delta tables:
- ANALYZE TABLE $partitionedTableName PARTITION (p1) COMPUTE STATISTICS
- ALTER TABLE $partitionedTableName ADD PARTITION (p1=3)
- ALTER TABLE DROP PARTITION
- ALTER TABLE RECOVER PARTITIONS
- ALTER TABLE SET SERDEPROPERTIES
- INSERT OVERWRITE DIRECTORY
- SHOW CREATE TABLE
- CREATE TABLE LIKE
We are excited to announce that the newly revamped and optimized Intelligent Cache feature is now enabled by default in Fabric Spark. The Intelligent Cache works seamlessly behind the scenes and caches data to help speed up the execution of Spark jobs in Microsoft Fabric as it reads from your OneLake or ADLS Gen2 storage via shortcuts. It automatically detects changes to the underlying files and refreshes them in the cache, providing you with the most recent data. When the cache size reaches its limit, the cache automatically releases the least-read data to make space for more recent data. This feature lowers the total cost of ownership by improving performance by up to 60% on subsequent reads of files that are stored in the available cache. You can learn more about this feature here: https://learn.microsoft.com/fabric/data-engineering/intelligent-cache
We are thrilled to share our latest enhancements in the monitoring hub, designed to provide customers with a comprehensive and detailed view of Spark and Lakehouse activities.
- Executor allocations: The executors’ information has been added as an additional column, responding to one of the top requests from customers. This feature allows users to view Spark executor allocations and utilization for optimization and core usage visibility.
- Runtime version: To support multiple Spark runtime versions, users can now view the Spark runtime version used for a Spark application in the monitoring hub and on the Spark application L2 page.
- Related items link: The detail page has been updated to include related item links corresponding to the snapshot or notebook with a refined UI.
- Columns customization: Users can now customize their preferred Spark columns in the monitoring hub, making it easier to sort, search, and filter.
To enhance the monitoring hub as a unified platform for viewing all Spark activities, including Lakehouse, users can now view the progress and status of Lakehouse maintenance jobs and table load activities. Users can also drill down to view more details on Lakehouse table-level operations.
Spark application resource usage analysis (preview)
Responding to customers’ requests for monitoring Spark resource usage metrics for performance tuning and optimization, we are excited to introduce the Spark resource usage analysis feature, now available in public preview. This newly released feature enables users to monitor allocated executors, running executors, and idle executors, alongside Spark executions. Users can zoom in and out as needed for both running and completed Spark applications. The feature also provides a calculated utilization efficiency, allowing users to assess the health of their resource utilization. Additionally, users can drill down to view the related jobs and tasks of the executors.
REST API support for Spark Job Definition (preview)
We’re excited to announce the launch of RESTful public APIs for Spark Job Definition! The SJD REST public APIs make it easy for users to manage and manipulate SJD items. Here are the supported Spark Job Definition REST APIs:
- Create SJD with Definition
- Get SJD Definition
- Update SJD Definition
- Get SJD run status
As a key requirement for workload integration, we are announcing the launch of RESTful public APIs for Lakehouse! The Lakehouse REST public APIs make it easy for users to manage and manipulate Lakehouse items programmatically. The key capabilities of the Load to tables and table maintenance features are also supported, guaranteeing programmability of data ingestion and ensuring that Delta tables are maintained at top performance for Fabric consumption.
Lakehouse support for git integration and deployment pipelines (preview)
The Lakehouse artifact now integrates with the lifecycle management capabilities in Microsoft Fabric, providing standardized collaboration among all development team members throughout the product’s life. Lifecycle management facilitates an effective product versioning and release process by continuously delivering features and bug fixes into multiple environments.
Summary of git integration and deployment pipelines capabilities:
- Serialization of the Lakehouse object metadata to a git JSON representation.
- Apply changes directly or use pull requests to control changes to upstream or downstream workspaces and branches.
- The renaming of Lakehouses is tracked in git. Updating a renamed Lakehouse also renames its default semantic model and SQL analytics endpoint.
- Deployment across Dev-Test-Production workspaces.
- Lakehouse can be removed as a dependent object upon deployment. Mapping different Lakehouses within the deployment pipeline context is also supported.
- Updates to Lakehouse name can be synchronized across workspaces in a deployment pipeline context.
We are thrilled to announce that the powerbiclient Python package is now natively supported in Fabric notebooks. This means you can easily embed and interact with Power BI reports in your notebooks with just a few lines of code.
You can also create stunning reports based on a Pandas dataframe or a Spark dataframe in the context of a notebook run.
Power BI reports in Fabric notebooks are a good way to tell compelling data stories and share insights with others. To learn more about how to use the powerbiclient package, check out this article on how to embed a Power BI component. Remember that you don’t need to do any extra setup in Fabric notebooks using Spark 3.4; just import the package and things will work!
We are thrilled to introduce a new API in mssparkutils called mssparkutils.notebook.runMultiple(), which allows you to run multiple notebooks in parallel or with a predefined topological structure.
With mssparkutils.notebook.runMultiple(), you can:
- Execute multiple notebooks simultaneously, without waiting for each one to finish.
- Specify the dependencies and order of execution for your notebooks, using a simple JSON format.
- Optimize the use of Spark compute resources and reduce the cost of your Fabric projects.
- View snapshots of each notebook run record in the output, and debug/monitor your notebook tasks conveniently.
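A minimal sketch of the DAG structure passed to mssparkutils.notebook.runMultiple() might look like this (notebook names and field names are assumptions based on the documented JSON format; run mssparkutils.notebook.help("runMultiple") for the authoritative shape):

```python
# Hypothetical DAG: prep_a and prep_b run in parallel; aggregate waits for both.
dag = {
    "activities": [
        {"name": "prep_a", "path": "PrepNotebookA", "args": {"region": "EU"}},
        {"name": "prep_b", "path": "PrepNotebookB", "args": {"region": "US"}},
        {
            "name": "aggregate",
            "path": "AggregateNotebook",
            # Runs only after both upstream notebooks finish successfully.
            "dependencies": ["prep_a", "prep_b"],
        },
    ]
}

# Inside a Fabric notebook you would then execute:
#   mssparkutils.notebook.runMultiple(dag)
```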
To learn more about this new API and how to use it, please refer to the documentation here. You can also run mssparkutils.notebook.help(“runMultiple”) to find out more.
We now support uploading .jar files in the notebook Resources explorer. You can put your own compiled libraries here, use drag and drop to generate a code snippet that installs them in the session, and load the libraries in code conveniently.
Notebook Git integration (preview)
Fabric notebooks now offer Git integration for source control using Azure DevOps. It allows users to easily control the notebook code versions and manage the Git branches by leveraging the Fabric Git functions and Azure DevOps.
Users can set up a connection to their repo from the workspace settings by following the Fabric Git integration instructions. Once connected, your notebook items will appear in the “Source control” panel together with other items inside the workspace. After the notebook artifact is committed to the Git repo successfully, you’ll be able to see the artifact folder structure and source code in the Azure DevOps Repos Files view.
Additionally, when committing the notebook artifact to the Git repo, the code will be converted to a source code format (e.g., a PySpark notebook to a ‘notebook-content.py’ file) instead of a standard ‘.ipynb’ file. This approach allows for easier code reviews using built-in diff features.
In the artifact content source file, the artifact’s metadata including lakehouses, markdown cells, and code cells will be preserved and distinguished, allowing for precise recovery when synced back to a Fabric workspace.
Notebook in Deployment Pipeline (preview)
Now you can also use notebooks to deploy your code across different environments, such as development, test, and production. This enables you to streamline your development process, ensure quality and consistency, and reduce manual errors. You can also use deployment rules to customize the behavior of your notebooks when they are deployed, such as changing the default Lakehouse of a Notebook.
You can follow the instructions in Get started with deployment pipelines to set up your deployment pipeline; notebooks will show up in the deployment content automatically.
You will be able to compare content in different stages, deploy content from one stage to another, monitor the deployment history, and configure deployment rules.
Notebook REST APIs (preview)
We’re excited to announce the launch of RESTful public APIs for the Notebook item! With the notebook public APIs, data engineers and data scientists can now automate their pipelines and establish CI/CD conveniently and efficiently. The notebook RESTful public APIs make it easy for users to manage and manipulate Fabric notebook items and to integrate notebooks with other tools and systems. Here is the list of supported Notebook REST APIs:
- Notebook Management: Create Item/Delete Item/Get Item/Get Item Definition/List Item/Update Item/Update Item Definition
- Notebook Job Scheduler: Run on Demand Item Job (with parameterization support)/ Cancel Item Job Instance/ Get Item Job Instance
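As a sketch, an Update Item Definition call carries the notebook source as a base64-encoded notebook-content.py part (the part names and payload shape are assumptions based on the Fabric item definition conventions; see the Fabric REST APIs public documentation):

```python
import base64
import json

# Hypothetical notebook source in the source-code format used by Git integration.
source = "# Fabric notebook source\nprint('hello from Fabric')\n"

definition = {
    "definition": {
        "parts": [
            {
                "path": "notebook-content.py",
                # The source is shipped inline as base64 text.
                "payload": base64.b64encode(source.encode()).decode(),
                "payloadType": "InlineBase64",
            }
        ]
    }
}

# This body would be POSTed to the notebook item's updateDefinition endpoint.
print(json.dumps(definition)[:60])
```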
You can learn more by checking out the Fabric REST APIs public documentation.
We are thrilled to announce the public preview of the environment item in Fabric. The environment is a centralized item that allows you to configure all the required settings for running a Spark job in one place. It provides a unified interface for managing, maintaining, and sharing the configurations. In an environment, you can select a different Spark runtime, refine the compute resources, install libraries, and more.
Once you have configured your environment, you can attach it to your Notebooks, Spark job definitions, or even the workspace, serving as the workspace default. The configurations will be effective, and the libraries will be available, once the Spark session is started with the environment attached.
Synapse VS Code extension in vscode.dev (preview)
We are excited to introduce the public preview of the Synapse VS Code extension on vscode.dev. With support for vscode.dev, a lightweight version of VS Code that operates entirely within your web browser, users can now seamlessly edit and execute Fabric notebooks without ever leaving their browser window. Additionally, all the native pro-developer features of VS Code are now accessible to end users in this environment.
Any code modifications made within vscode.dev will be instantly applied to the workspace. If users still prefer working within their local development environment, they can opt to download and edit their notebooks using the desktop version of VS Code.
Copilot in notebooks (preview)
We’re excited to announce the public preview of Copilot in Fabric Data Science and Data Engineering notebooks. Copilot is designed to accelerate productivity, provide helpful answers and guidance, and generate code for common tasks like data exploration, data preparation, and machine learning. You can interact and engage with the AI from either the chat panel or even from within notebook cells using magic commands to get insights from data faster. Note that this Copilot public preview does not require any sign-up and will gradually be rolled out to customers during the coming months.
Learn more about Copilot in Microsoft Fabric
The chat panel makes it easy to seek help and insights into data directly on top of notebooks. Users can use natural language to ask questions about their data, generate code snippets based on prompt inputs, and even ask Copilot to provide helpful next steps in the notebook. The Copilot chat panel shares all background context with the notebook’s Spark session, so any executed cells or dataframes added to the notebook are available to the chat panel.
- Chat magics:
Chat magics are magic commands, such as %%chat and %%code, which can be used inside of a notebook to make requests to Copilot to provide insightful recommendations. Magics can provide assistance directly from within the notebook cell output and can be persisted as output into the notebook.
Custom Python Operations in Data Wrangler
Data Wrangler, a notebook-based tool for exploratory data analysis, has always allowed users to browse and apply common data-cleaning operations, generating the corresponding code in real time. Now, in addition to generating code from the UI, users can also write their own code with custom operations. Like every transformation in Data Wrangler, custom operations update the display grid, offering a live preview of each step before it is applied.
- Getting Started: You can initiate a custom operation either by selecting “Custom operation” from the left-hand Operations panel or by typing directly into the code editor below the display grid.
- Modifying the DataFrame: Your custom code will be previewed in the display grid in real time. Note that you can modify the DataFrame variable (‘df’) directly or set it equal to a new value (‘df = ...’).
- Previewing the Transformation: Once you’ve stopped typing, the effect of your custom operation will be previewed in the display grid, as with any other Data Wrangler transformation. You can apply or discard the step or edit the code to modify the transformation.
Data Wrangler for Spark DataFrames (preview)
We’re pleased to announce that Data Wrangler now supports Spark DataFrames in public preview. Until now, users have been able to explore and transform pandas DataFrames using common operations that can be converted to Python code in real time. The new release allows users to edit Spark DataFrames in addition to pandas DataFrames. In both cases, you can now customize the sample that Data Wrangler displays, specifying the number of rows to show and choosing from among three sampling methods (first records, last records, or a random subset). In order to maintain performance, Spark DataFrames are automatically converted to pandas samples in Data Wrangler’s preview grid, but all the generated code will be converted to PySpark when it is exported back to your notebook.
This built-in sampling ensures that Data Wrangler feels consistent no matter what kind of data you’re working with. If you’re already familiar with Data Wrangler for your pandas use cases, you’ll notice only a few changes when you open a Spark DataFrame.
- Launching Data Wrangler: Beneath the list of pandas variables in the Data Wrangler dropdown, you’ll now see a list of Spark variables and an option to choose a custom sample.
- Exploring and Transforming Data: Once the tool loads, an informational banner above the preview grid will remind you, if you’re working with Spark data, that all the code generated by Data Wrangler will be converted to PySpark when it’s added back to your notebook.
- Exporting Code: Browsing operations and applying transformations remain the same, but when you’re ready to export the committed steps as code, Data Wrangler will display a preview of the final PySpark and provide the option to save the interim pandas code for further exploration.
We’ve introduced the MLflow inline authoring widget, enabling users to effortlessly track their experiment runs along with metrics and parameters, all directly from within their notebook. Additionally, users can access a Run Comparison view, where they can visually compare runs using various plot types and customizations. Lastly, for those seeking even more insights, you can dive into the Experiment item directly from your notebook.
We have made significant enhancements to our model and experiment tracking features based on valuable user feedback. The new tree-control in the run details view makes tracking easier by showing which run is selected.
Furthermore, we’ve enhanced the comparison feature, allowing you to easily adjust the comparison pane for a more user-friendly experience. Run names are now highlighted across plots, making them easier to spot.
In addition, transitioning to the Run Details view is now effortless – simply click on the run name. Plus, users can now apply filters to any column, offering increased flexibility and user-friendliness, even beyond their customized columns view.
We’ve made it simpler for users to check out recent runs for an experiment directly from the workspace list view. This update makes it easier to keep track of recent activity, quickly jump to the related Spark application, and apply filters based on the run status.
We’re excited to share some important updates regarding a terminology shift in Fabric. We’re evolving our terminology from “Models” to “ML Models” to ensure clarity and avoid any confusion with other Fabric elements. This change will be visible in key UI elements and updated documentation, though APIs remain unaffected. While we understand that this may introduce some initial adjustments, it’s a necessary step towards enhancing the overall usability and comprehensibility of the product. This transition aligns with the upcoming general availability of Fabric and mirrors the rename of Power BI Datasets to Semantic Models, making Fabric even more user-friendly.
To learn more, you can visit: Machine learning experiments in Microsoft Fabric.
We are pleased to announce the release of SynapseML v1.0, our open-source library that simplifies the creation of massively scalable machine learning (ML) pipelines. SynapseML makes it easy to build production ready machine learning systems on Fabric and has been in use at Microsoft for over 6 years.
This milestone release introduces new integrations with Vector Search engines for efficient management of GPT embeddings, advanced conversational AI and LLM capabilities, Orthogonal Forest DoubleML for robust causal inference, key-free authentication for Azure AI services on Fabric, and much more. This release underscores our commitment to providing powerful open-source tools for your machine learning needs and supporting the SynapseML community.
We’ve introduced a scalable implementation of Explainable Boosting Machines (EBM) powered by Apache Spark in SynapseML. EBMs are a powerful machine learning technique that combines the accuracy of gradient boosting with a strong focus on model interpretability. With this addition, you can now leverage the EBM model in your machine learning endeavors within SynapseML, making it easier to gain insights and transparency in domains where understanding model decisions is crucial, such as healthcare, finance, and regulatory compliance.
You can learn how to train EBMs for classification or regression scenarios.
We are excited to announce the public preview for prebuilt AI models in Fabric. Fabric seamlessly integrates with Azure AI services, allowing you to enrich your data with prebuilt AI models without any prerequisites. Azure OpenAI Service, Text Analytics, and Azure AI Translator are available out of the box in Fabric, with support for both RESTful API and SynapseML. You can also use the OpenAI Python Library to access Azure OpenAI Service in Fabric. For more information on available models, see prebuilt AI models in Fabric.
We have added support for a new connection method called “synapse” in sparklyr, which enables users to connect to an existing Spark session. Additionally, we have contributed this connection method to the OSS sparklyr project. Users can now use both sparklyr and SparkR in the same session and easily share data between them.
We’re pleased to begin rolling out a set of public-facing REST APIs for the main data science artifacts: the ML Experiment and the ML Model. These APIs begin to empower users to create and manage machine-learning artifacts programmatically, a key requirement for pipeline automation and workload integration. Stay tuned in the coming months for updates about more robust API support. For now, the following APIs are supported:
- List ML Experiments in a Workspace
- List ML Models in a Workspace
- Create ML Experiment
- Create ML Model
- Get ML Experiment
- Get ML Model
- Update ML Experiment
- Delete ML Experiment
- Delete ML Model
We’re thrilled to announce an update to the Data Science Happy Path tutorial for Microsoft Fabric. This new comprehensive tutorial demonstrates the entire data science workflow, using a bank customer churn problem as the context. It’s the perfect resource to kickstart your data science journey on Fabric. The updated tutorial includes the following key steps:
- Ingest data into a Fabric lakehouse using Apache Spark.
- Load existing data from delta tables in the lakehouse.
- Clean and transform data with tools like Apache Spark and Fabric Data Wrangler.
- Create experiments and runs to train various machine learning models.
- Register and track trained models using MLflow and the Fabric UI.
- Perform scalable scoring with the PREDICT function and save results to the lakehouse.
- Visualize predictions and business insights in Power BI using Direct Lake.
We are excited to announce that we have expanded our collection of data science samples by introducing new categories of samples that include end-to-end workflow samples in R and Quick tutorials samples. The two new end-to-end R samples demonstrate the entire data science workflow for the bank customer churn problem and the credit card fraud detection problem using the R language. The two new “Quick tutorial” samples also include “Explaining Model Outputs” and “Visualizing Model Behavior.” The first tutorial will walk you through the process of using SHapley Additive exPlanations to gain deeper insights into model outputs. The second tutorial shows how to leverage Partial Dependence and Individual Conditional Expectation plots to interpret the impact of various features on the model outputs.
Please check out these new samples and let us know your thoughts, as we are committed to continually improving your data science experience on Microsoft Fabric.
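To show the kind of analysis the "Visualizing Model Behavior" tutorial covers, here is a minimal Partial Dependence / ICE computation using scikit-learn's `partial_dependence`; the model and synthetic data are illustrative choices, not the tutorial's own notebook.

```python
# Minimal PDP/ICE sketch: the average effect of one feature on predictions
# (the PDP curve) plus the per-sample curves that make up an ICE plot.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import partial_dependence

rng = np.random.RandomState(0)
X = rng.uniform(size=(200, 3))
y = 2 * X[:, 0] + np.sin(6 * X[:, 1]) + rng.normal(scale=0.1, size=200)

model = GradientBoostingRegressor(random_state=0).fit(X, y)

# kind="both" returns the averaged PDP and the individual ICE curves.
pd_res = partial_dependence(model, X, features=[0], kind="both")
print(pd_res["average"].shape)     # (n_outputs, n_grid_points)
print(pd_res["individual"].shape)  # (n_outputs, n_samples, n_grid_points)
```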
We are happy to introduce our newest Data Science sample – Sales Forecasting – developed in collaboration with Sonata Software. This new sample encompasses the entire data science workflow, spanning from data cleaning to Power BI visualization. It is designed to forecast product sales in a superstore, harnessing the power of the SARIMAX algorithm. Within this sample, you will discover comprehensive guidance on model training and parameter optimization, helping you gain a deep understanding of how to make accurate forecast predictions.
The sample also includes a Power BI report that enables you to transform your data-driven forecasts into compelling visualizations that are both informative and visually appealing.
To achieve seamless data access across all compute engines in Microsoft Fabric, Delta Lake is chosen as the unified data lake table format.
As part of the one logical copy promise, we are excited to announce that data in KQL Database can now be made available in OneLake in delta parquet format.
The data streamed to KQL Database is stored in an optimized columnar storage format with full text indexing and supports complex analytical queries at low latency on structured, semi-structured, and free text data.
Enabling KQL DB data availability in OneLake means customers enjoy the best of both worlds: they can query the data with high performance and low latency in KQL DB, and query the same data in Delta parquet in OneLake via any other Fabric engine (Direct Lake mode in Power BI, Data Warehouse, Lakehouse, Notebooks, etc.). KQL DB provides a robust mechanism to batch the incoming streams of trickling data into Delta parquet files suitable for analysis. The Delta representation keeps the data open and reusable; it is managed once and paid for once, and users should consider it a single data set.
Users will only be charged once for the data storage after enabling the KQL DB data availability in OneLake.
Enabling the feature is quite simple. All that is needed is for the user to enable the Data Availability option of the selected KQL Database:
Once the feature is enabled, you should be able to see all the new data added to your database at the given OneLake path in Delta parquet.
You can now access this Delta table by creating a OneLake shortcut from Lakehouse, Data warehouse or directly via Power BI Direct Lake mode.
You can enable the OneLake availability at a table or database level.
Microsoft Fabric has formally announced the release of several open-source connectors for real-time analytics. These connectors enable users to ingest data from various sources and process it using KQL DB.
We have released a new Get Data experience to simplify the data ingestion process in KQL DB! Designed with simplicity and efficiency in mind, this update streamlines the way you bring data into KQL DB. Whether you’re a seasoned data professional or just starting your data exploration journey, this new user experience is crafted to empower you every step of the way.
Simplifying Your Data Flow
The primary objective of the new Get Data experience is to streamline your workflow. We understand the importance of efficiency in data exploration, and that’s why we’ve reimagined the process to minimize the number of clicks required, ensuring that you can bring in data from all your familiar data sources, including Eventstream, OneLake, local files, Azure Storage, Event Hubs, and Amazon S3.
The Get Data experience is powered by a new Guided Wizard. This step-by-step assistant takes you through the entire data ingestion process, ensuring that you’re never left in the dark. Whether you’re extracting data from various sources, transforming it to suit your needs, or loading it into KQL DB, the Guided Wizard is your knowledgeable co-pilot, offering insights and guidance at every turn.
The data sources supported by dedicated guided wizards are:
- Event Streams: Route real-time events from Eventstream to KQL Database destinations with a no-code experience.
- OneLake: OneLake comes automatically with every Microsoft Fabric tenant and is designed to be the single place for all your analytics data. You can seamlessly ingest data from OneLake into your KQL Database with just a few clicks.
- Local File: This is the simplest way to insert data into a KQL Database. You can upload a file from your local machine using the Get Data UX. This option is useful when you have a small amount of data that you want to analyze quickly.
- Azure Storage: If your data is stored in Azure Blob Storage or Azure Data Lake Storage Gen2, you can use built-in support for Azure storage to ingest it.
- Event Hubs: If you are using Azure Event Hubs to stream your data, you can seamlessly create a connection to your Azure resource from the wizard.
- Amazon S3: If your data is stored in Amazon S3, you can use built-in S3 support to ingest it. All you need is the URI of the S3 bucket or of individual blobs.
We’re excited to announce the launch of RESTful Public APIs for KQL DB. The Public REST APIs of KQL DB will enable users to manage and automate their flows programmatically. Here is the list of supported REST APIs for KQL DB:
- List KQL DBs
- Get KQL DB details
- Update KQL DB definition
- Delete KQL DB
You can learn about detailed usage in the Fabric REST APIs public documentation.
Splunk Add-on (preview)
The Microsoft Fabric Add-On for Splunk allows users to ingest logs from the Splunk platform into a Fabric KQL DB using the Kusto Python SDK.
When data is added to Splunk, the Splunk indexer processes it and stores it in a designated index. Searches in Splunk then use the indexed data to create metrics, dashboards, and alerts. The add-on hooks into this flow through alert actions: when a Splunk alert is triggered, its alert action sends the matching data to a Microsoft Fabric KQL DB.
The add-on uses the Kusto Python SDK ( https://learn.microsoft.com/azure/data-explorer/kusto/api/python/kusto-python-client-library ) to send log data to Microsoft Fabric KQL DB. It supports queued ingestion by default and also offers a durable mode that helps minimize data loss during unexpected network errors. Durable ingestion may reduce throughput, however, so it is advised to use this option judiciously.
The following describes the stages needed:
- Download and install the Microsoft Fabric Addon
- Create a Splunk index
- Create a KQL DB and table
- Create a Splunk Alert that defines the data to be ingested into KQL DB and select the Trigger Actions = “Send to Microsoft”.
- Configure the connection details to point to the KQL DB
Event Streams is now Generally Available
We’re pleased to announce Event Streams GA with a slew of enhancements aimed at taking your data processing experience to the next level. These updates, focused on efficiency and user experience, include data transformation for the KQL Database, seamless ingestion of Eventstream data into other Fabric items, and optimization of Lakehouse tables. These changes are designed to transform the way you manage your data streams.
Event Streams Data Transformation for KQL DB (Generally Available)
Now, you can transform your data streams in real time within Eventstream before they are sent to your KQL Database. When you create a KQL Database destination in the Eventstream, you can set the ingestion mode to “Event processing before ingestion” and add event processing logic such as filtering and aggregation to transform your data streams. With this feature, you have greater control and flexibility over your data streams, ensuring they’re properly structured for your database.
Get Data from Eventstream Anywhere in Fabric
If you’re working on other Fabric items and are looking to ingest data from Eventstream, our new “Get data from Eventstream” feature is here to help you. This feature enables you to seamlessly ingest Eventstream data to various Fabric items, offering both flexibility and convenience. It simplifies the process of routing your data streams efficiently throughout the entire Fabric environment.
Create a Cloud Connection within Eventstream
We’ve simplified the process of establishing a cloud connection to your Azure services within Eventstream. When adding an Azure resource, such as Azure IoT Hub and Azure Event Hubs, to Eventstream, you can now create the cloud connection and enter your Azure resource credentials right within Eventstream. This enhancement significantly improves the process of adding new data sources to Eventstream, saving you time and effort.
To cater to your specific requirements, we’ve introduced two distinct ingestion modes for your Lakehouse destination. You can select one of these modes to optimize how Eventstream writes to Lakehouse based on your scenario.
- Rows per file – You can now specify the minimum number of rows that Lakehouse ingests in a single file. The smaller the minimum number of rows, the more files Lakehouse will create during ingestion.
- Duration – You can now specify the maximum duration that Lakehouse would take to ingest a single file. The longer the duration, the more rows will be ingested in a file.
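Conceptually, the two settings act as a "flush when either threshold trips" rule. The toy buffer below is not Eventstream's actual implementation, just an illustration of the effect: a lower row minimum or a shorter duration yields more, smaller files.

```python
# Toy model of the two Lakehouse ingestion modes: a file is closed once
# enough rows accumulate OR once it has been open long enough.
from dataclasses import dataclass, field

@dataclass
class FileBuffer:
    min_rows: int          # "Rows per file" setting
    max_seconds: float     # "Duration" setting
    rows: list = field(default_factory=list)
    opened_at: float = 0.0
    files: list = field(default_factory=list)

    def add(self, row, now: float):
        if not self.rows:
            self.opened_at = now  # first row opens a new file
        self.rows.append(row)
        # Flush once enough rows accumulated or the file is old enough.
        if len(self.rows) >= self.min_rows or now - self.opened_at >= self.max_seconds:
            self.files.append(self.rows)
            self.rows = []

buf = FileBuffer(min_rows=3, max_seconds=60.0)
for t, event in enumerate(["a", "b", "c", "d", "e", "f"]):
    buf.add(event, now=float(t))
print(len(buf.files))  # two files of three rows each
```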
Our customers have asked about compacting the many small streaming files generated on a Lakehouse table, and we have a solution: a table optimization shortcut is now available inside the Eventstream Lakehouse destination. It opens a Notebook with a Spark job that compacts the small streaming files in the destination Lakehouse table.
The connectors for Lakehouse, Warehouse and KQL Database are officially out of public preview and generally available. We encourage you to use these connectors when trying to connect to data from any of these Fabric experiences.
When you schedule a Dataflow refresh, the refresh process has a timeout of 8 hours, meaning a single refresh job cannot run for more than 8 hours. During those 8 hours, however, you could be consuming resources while the operation loops due to an unforeseen situation with your data source or with one of the transformation steps in your Dataflow.
To prevent unnecessary resources from being consumed, we’ve implemented a new mechanism that will stop the refresh of a Dataflow as soon as the results of the refresh are known to have no impact. This is to reduce consumption more proactively.
We made diagnostics improvements to provide meaningful error messages when a Dataflow refresh fails for Dataflows running through the Enterprise Data Gateway.
Column binding is described as follows: Data fetched from the data source is returned to the application in variables that the application has allocated for this purpose. Before this can be done, the application must associate, or bind , these variables to the columns of the result set; conceptually, this process is the same as binding application variables to statement parameters. When the application binds a variable to a result set column, it describes that variable – address, data type, and so on – to the driver. The driver stores this information in the structure it maintains for that statement and uses the information to return the value from the column when the row is fetched.
Enabling support for column binding for SAP HANA as an optional parameter will result in significantly improved performance for users that opt into the feature.
When using a Dataflow Gen2 in Fabric, the system will automatically create a set of staging artifacts which support the Dataflow Gen2 operation by providing storage and compute engine for processes that require it such as staging of queries or the Data destination feature. You can find more details regarding this in this earlier blog post: Data Factory Spotlight: Dataflow Gen2 | Microsoft Fabric Blog
Starting today, these artifacts will be abstracted from the Dataflow Gen2 experience and will be hidden from the workspace list. No action is required by the user and this change has no impact on any Dataflows that you’ve created before.
The Power Query Editor user interface used inside of Dataflows Gen2 has been updated to match the style used throughout Microsoft Fabric.
Support for VNET Gateways (preview)
We’re also very excited to announce that we’re releasing VNet Data Gateway support for Dataflows Gen2 in Fabric as a Public Preview feature today. The VNet data gateway helps customers to connect from Fabric Dataflows Gen2 to their Azure data services within a VNet without the need of an on-premises data gateway. The VNet data gateway securely communicates with the data source, executes queries, and transmits results back to Fabric.
You can learn more about VNet data gateways from the following link.
You can now clone your data pipelines across workspaces by using the “Save as” button. This makes it easier to develop pipelines collaboratively inside Fabric workspaces without having to redesign your pipelines from scratch.
Dynamic content flyout integration with Email and Teams activity
In the Email and Teams activities, you can now add dynamic content with ease. With this new pipeline expression integration, you will now see a flyout menu to help you select and build your message content quickly without needing to learn the pipeline expression language.
Copy activity now supports fault tolerance for the Fabric Data Warehouse connector
The Copy activity in data pipelines now supports fault tolerance for Fabric Data Warehouse. Fault tolerance allows you to handle certain errors without interrupting data movement. By enabling fault tolerance, you can continue to copy data while skipping incompatible data like duplicated rows.
We’re excited to announce that the MongoDB and MongoDB Atlas connectors are now available to use in your Data Factory data pipelines as sources and destinations. In your data pipeline, you can create a new connection to your MongoDB or MongoDB Atlas data source to copy, extract, and transform your data.
Microsoft 365 connector now supports ingesting data into Lakehouse (preview)
We’re excited to share that the Microsoft 365 connector now supports ingesting data into Lakehouse tables.
You can now open and edit data pipelines from different workspaces and navigate between them using the multi-tasking capabilities in Fabric.
You can now edit your data connections within your data pipelines. Previously, a new tab would open when connections needed editing. Now, you can remain within your pipeline and seamlessly update your connections.
Community and Learning
We are excited to announce the Microsoft Ignite: Microsoft Fabric Challenge as part of the Microsoft Learn Cloud Skills Challenge. Skill up for in-demand tech scenarios, and enter to win a VIP pass to the next Microsoft Ignite. The challenge runs until January 15, 2024.