Archive for the ‘Ai’ Category

Unlock real-time insights with AI-powered analytics in Microsoft Fabric – Microsoft

The data and analytics landscape is changing faster than ever. From the emergence of generative AI to the proliferation of citizen analysts to the increasing importance of real-time, autonomous action, keeping up with the latest trends can feel overwhelming. Every trend requires new services that customers must manually stitch into their data estate, driving up both cost and complexity.

With Microsoft Fabric, we are simplifying and future-proofing your data estate with an ever-evolving, AI-powered data analytics platform. Fabric will keep up with the trends for you and seamlessly integrate each new capability so you can spend less time integrating and managing your data estate and more time unlocking value from your data.

Aurizon, Australia's largest rail freight operator, turned to Fabric to modernize their data estate and analytics system.

With Microsoft Fabric, we've answered many of our questions about navigating future growth, removing legacy systems, and streamlining and simplifying our architecture. A trusted data platform sets us up to undertake complex predictive analytics and optimizations that will give greater surety for our business and drive commercial benefits for Aurizon and our customers in the very near future.

Aurizon is just one among thousands of customers who have already used Fabric to revolutionize how they connect to and analyze their data. In fact, a 2024 commissioned Total Economic Impact (TEI) study conducted by Forrester Consulting found that Microsoft Fabric customers saw a three-year 379% return on investment (ROI) with a payback period of less than six months. We are thrilled to share a huge range of new capabilities coming to Fabric. These innovations will help you more effectively uncover insights and keep you at the forefront of the trends in data and analytics. Check out a quick overview of the biggest changes coming to Fabric.

Prepare your data for AI innovation with Microsoft Fabric, now generally available

Fabric is a complete data platform, giving your data teams the ability to unify, transform, analyze, and unlock value from data within a single, integrated software as a service (SaaS) experience. We are excited to announce additions to the Fabric workloads that will make Fabric's capabilities even more robust and customizable to meet the unique needs of each organization. These enhancements include:

When we introduced Fabric, it launched with seven core workloads, including Synapse Real-Time Analytics for data streaming analysis and Data Activator for monitoring and triggering actions in real time. We are unveiling an enhanced workload called Real-Time Intelligence that combines these two workloads and brings an array of additional new features, in preview, to help organizations make better decisions with up-to-the-minute insights. From ingestion to transformation, querying, and taking immediate action, Real-Time Intelligence is an end-to-end experience that enables seamless handling of real-time data without the need to land it first. With Real-Time Intelligence, you can ingest streaming data with high granularity, dynamically transform streaming data, query data in real time for instant insights, and trigger actions like alerting a production manager when equipment is overheating or rerunning jobs when data pipelines fail. And with both simple low-code or no-code interfaces and powerful code-rich ones, Real-Time Intelligence empowers every user to work with real-time data.

Behind this powerful workload is the Real-time hub, a single place to discover, manage, and use event streaming data from Fabric, other Microsoft sources, third-party cloud providers, and external data sources. Just like the OneLake data hub makes it easy to discover, manage, and use data at rest, the Real-time hub helps you do the same for data in motion. All events that flow through the Real-time hub can be easily transformed and routed to any Fabric data store, and users can create new streams that can be discovered and consumed. From the Real-time hub, users can gain insights through the data profile, configure the right level of endorsement, set alerts on changing conditions, and more, all without leaving the hub. While the existing Real-Time Analytics capabilities are still generally available, the Real-time hub and the other new capabilities coming to the Real-Time Intelligence workload are currently in preview. Watch the demo video to check out the redesigned Real-Time Intelligence experience.

Elcome, one of the world's largest marine electronics companies, built a new service on Fabric called Welcome that helps maritime crews stay connected to their families and friends.

Microsoft Fabric Real-Time Intelligence has been the essential building block that's enabled us to monitor, manage, and enhance the services we provide. With the help of the Real-time hub for centrally managing data in motion from our diverse sources and Data Activator for event-based triggers, Fabric's end-to-end cloud solution has empowered us to easily understand and act on high-volume, high-granularity events in real time with fewer resources.

Real-time insights are becoming increasingly critical across industries, powering route optimization in transportation and logistics, grid monitoring in energy and utilities, predictive maintenance in manufacturing, and inventory management in retail. And since Real-Time Intelligence comes fully optimized and integrated in a SaaS platform, adoption is seamless. Strathan Campbell, Channel Environment Technology Lead at One NZ, the largest mobile carrier in New Zealand, said they went from a concept to a delivered product in just two weeks. To learn more about the Real-Time Intelligence workload, watch the Ingest, analyze and act in real time with Microsoft Fabric Microsoft Build session or read the Real-Time Intelligence blog.

Fabric was built from the ground up to be extensible, customizable, and open. Now, we are making it even easier for software developers and customers to design, build, and interoperate applications within Fabric with the new Fabric Workload Development Kit, currently in preview. Applications built with this kit appear as native workloads within Fabric, providing a consistent experience for users directly in their Fabric environment without any manual effort. Software developers can publish and monetize their custom workloads through Azure Marketplace. And, coming soon, we are creating a workload hub experience in Fabric where users can discover, add, and manage these workloads without ever leaving the Fabric environment. We already have industry-leading partners building on Fabric, including SAS, Esri, Informatica, Teradata, and Neo4j.

You can also learn more about the Workload Development Kit by watching the Extend and enhance your analytics applications with Microsoft Fabric Microsoft Build session.

We are also excited to announce two new features, both in preview, created with developers in mind: API for GraphQL and user data functions in Fabric. API for GraphQL is a flexible and powerful API that allows data professionals to access data from multiple sources in Fabric with a single query API. With API for GraphQL, you can streamline requests to reduce network overhead and accelerate response rates. User data functions are user-defined functions built for Fabric experiences across all data services, such as notebooks, pipelines, or event streams. These features enable developers to more easily build experiences and applications on Fabric data sources like lakehouses, data warehouses, and mirrored databases, with native code ability, custom logic, and seamless integration. You can watch these features in action in the Introducing API for GraphQL and User Data Functions in Microsoft Fabric Microsoft Build session.
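To make the shape of the feature concrete, here is a minimal sketch of what calling a Fabric GraphQL endpoint from Python could look like. The endpoint URL, token handling, and field names are illustrative assumptions rather than the documented API surface; the real endpoint and schema come from the API for GraphQL item you create in Fabric.

```python
# Hypothetical sketch of querying a Fabric API for GraphQL endpoint.
# The URL, token, and field names below are placeholders, not documented values.
import requests

GRAPHQL_ENDPOINT = "https://<your-fabric-graphql-endpoint>/graphql"  # placeholder
ACCESS_TOKEN = "<azure-ad-access-token>"  # obtained through your usual auth flow

# One query can pull related data that lives in different Fabric sources.
query = """
query {
  customers(first: 5) {
    items {
      customerId
      customerName
    }
  }
}
"""

response = requests.post(
    GRAPHQL_ENDPOINT,
    json={"query": query},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json()["data"])
```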

You can also learn more about the Workload Development Kit, the API for GraphQL, user data functions, and more by reading the Integrating ISV apps with Microsoft Fabric blog.

We are also announcing the preview of Data workflows in Fabric as part of the Data Factory experience. Data workflows allow customers to define Directed Acyclic Graph (DAG) files for complex data workflow orchestration in Fabric. Data workflows are powered by the Apache Airflow runtime and designed to help you author, schedule, and monitor workflows or data pipelines using Python. Learn more by reading the data workflows blog.
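For readers new to Airflow, the sketch below shows the kind of Python DAG file a data workflow would run; the pipeline name and task logic are illustrative, and Fabric's managed Airflow runtime handles the scheduling and execution.

```python
# Illustrative Airflow DAG: two Python tasks run daily, transform after extract.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pull data from the source system")


def transform():
    print("clean and reshape the extracted data")


with DAG(
    dag_id="daily_sales_pipeline",  # hypothetical pipeline name
    start_date=datetime(2024, 5, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # dependency: transform runs after extract
```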

The typical data estate has grown organically over time to span multiple clouds, accounts, databases, domains, and engines with a multitude of vendors and specialized services. OneLake, Fabric's unified, multi-cloud data lake built to span an entire organization, can connect to data from across your data estate and reduce data duplication and sprawl.

We are excited to announce the expansion of OneLake shortcuts to connect to data from on-premises and network-restricted data sources beyond just Azure Data Lake Storage Gen2, now in preview. With an on-premises data gateway, you can now create shortcuts to Google Cloud Storage, Amazon S3, and S3-compatible storage buckets that are either on-premises or otherwise network-restricted. To learn more about these announcements, watch the Microsoft Build session Unify your data with OneLake and Microsoft Fabric.

Insights drive impact only when they reach those who can use them to inform actions and decisions. Professional and citizen analysts bridge the gap between data and business results, and with Fabric, they have the tools to quickly manage, analyze, visualize, and uncover insights that can be shared with the entire organization. We are excited to help analysts work even faster and more effectively by releasing the model explorer and the DAX query view in Microsoft Power BI Desktop into general availability.

The model explorer in Microsoft Power BI provides a rich view of all the semantic model objects in the data pane, helping you find items in your data fast. You can also use the model explorer to create calculation groups and reduce the number of measures by reusing calculation logic and simplifying semantic model consumption.

The DAX query view in Power BI Desktop lets users discover, analyze, and see the data in their semantic model using the DAX query language. Users working with a model can validate data and measures without having to build a visual or use an additional tool, similar to the Explore feature. Changes made to measures can be seamlessly updated directly back to the semantic model.

To learn more about these announcements and others coming to Power BI, check out the Power BI blog.

When ChatGPT was launched, it reached over 100 million users in just over two months, the steepest adoption curve in the history of technology.1 It's been a year and a half since that launch, and organizations are still trying to translate the benefit of generative AI from novelty to actual business results. By infusing generative AI into every layer of Fabric, we can empower your data professionals to employ its benefits in the right context and in the right scenario to get more done, faster.

Copilot in Fabric was designed to help users unlock the full potential of their data by helping data professionals be more productive and business users explore their data more easily. With Copilot in Fabric, you can use conversational language to create dataflows, generate code and entire functions, build machine learning models, or visualize results. We are excited to share that Copilot in Fabric is now generally available, starting with the Power BI experience. This includes the ability to create stunning reports and summarize your insights into narrative summaries in seconds. Copilot in Fabric is also now enabled by default for all eligible tenants, including the Copilot in Fabric experiences for Data Factory, Data Engineering, Data Science, Data Warehouse, and Real-Time Intelligence, which are all still in preview. The general availability of Copilot in Fabric for the Power BI experience will be rolling out over the coming weeks to all customers with Power BI Premium capacity (P1 or higher) or Fabric capacity (F64 or higher).

We are also thrilled to announce a new Copilot in Fabric experience for Real-Time Intelligence, currently in preview, that enables users to explore real-time data with ease. Starting with a Kusto Query Language (KQL) Queryset connected to a KQL Database in an Eventhouse or a standalone Azure Data Explorer database, you can type your question in conversational language and Copilot will automatically translate it to a KQL query you can execute. This experience is especially powerful for users who are less familiar with writing KQL queries but still want to get the most from their time-series data stored in Eventhouse.
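As an illustration of what Copilot produces behind the scenes, the sketch below runs a hand-written KQL query of the kind Copilot might generate, using the azure-kusto-data Python client. The cluster URI, database, table, and column names are placeholders; Copilot itself works directly in the KQL Queryset without any code like this.

```python
# Illustrative only: executing KQL against a KQL database from Python.
# Cluster URI, database, table, and column names are placeholders.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster_uri = "https://<your-eventhouse-query-uri>"  # placeholder
database = "TelemetryDB"  # placeholder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(cluster_uri)
client = KustoClient(kcsb)

# The kind of KQL a question like "average temperature per hour over the
# last day" might translate to.
query = """
SensorReadings
| where Timestamp > ago(1d)
| summarize AvgTemp = avg(Temperature) by bin(Timestamp, 1h)
| order by Timestamp asc
"""

result = client.execute(database, query)
for row in result.primary_results[0]:
    print(row["Timestamp"], row["AvgTemp"])
```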

We are also thrilled to release a new AI capability in preview called AI skills, an innovative experience designed to provide any user with a conversational Q&A experience about their data. AI skills allow you to simply select the data source in Fabric you want to explore and immediately start asking questions about your data, even without any configuration. When answering questions, the generative AI experience will show the query it generated to find the answer, and you can enhance the Q&A experience by adding more tables, setting additional context, and configuring settings. AI skills can empower everyone to explore data, build and configure AI experiences, and get the answers and insights they need.

AI skills will honor existing security permissions and can be configured to respect the unique language and nuances of your organization, ensuring that responses are not just data-driven but steeped in the context of your business operations. And, coming soon, AI skills will be able to enrich the creation of new copilots in Microsoft Copilot Studio and be accessed from Copilot for Microsoft 365. It's about making your data not just accessible but approachable, inviting users to explore insights through natural dialogue and shortening the time to insight.

With the launch of Fabric, we've committed to open data formats, standards, and interoperability with our partners to give our customers the flexibility to do what makes sense for their business. We are taking this commitment a step further by expanding our existing partnership with Snowflake to deepen interoperability between Snowflake and Fabric's OneLake. We are excited to announce future support for Apache Iceberg in Fabric OneLake and bi-directional data access between Snowflake and Fabric. This integration will enable users to analyze their Fabric and Snowflake data written in Iceberg format in any engine within either platform, and access that data across apps like Microsoft 365, Microsoft Power Platform, and Microsoft Azure AI Studio.

With the upcoming availability of shortcuts for Iceberg in OneLake, Fabric users will be able to access all data sources in Iceberg format, including the Iceberg sources from Snowflake, and translate metadata between Iceberg and Delta formats. This means you can work with a single copy of your data across Snowflake and Fabric. Since all the OneLake data can be accessed in Snowflake as well as in Fabric, this integration will enable you to spend less time stitching together applications and your data estate, and more time uncovering insights.

We are also excited to announce we are expanding our existing relationship with Adobe. Adobe Experience Platform (AEP) and Adobe Campaign will have the ability to federate enterprise data from Fabric. Our joint customers will soon have the capability to connect to Fabric and use the Fabric Data Warehouse for query federation to create and enrich audiences for engagement, without having to transfer or extract the data from Fabric.

We are excited to announce that we are expanding the integration between Fabric and Azure Databricks, allowing you to have a truly unified experience across both products and pick the right tools for any scenario.

Coming soon, you will be able to access Azure Databricks Unity Catalog tables directly in Fabric, making it even easier to unify Azure Databricks with Fabric. From the Fabric portal, you can create and configure a new Azure Databricks Unity Catalog item in Fabric with just a few clicks. You can link a full catalog, a schema, or even individual tables, and the management of this Azure Databricks item in OneLake, a shortcut connected to Unity Catalog, is automatically taken care of for you.

This data acts like any other data in OneLake: you can write SQL queries or use it with any other workloads in Fabric, including Power BI through Direct Lake mode. When the data is modified or tables are added, removed, or renamed in Azure Databricks, the data in Fabric will always remain in sync. This new integration makes it simple to unify Azure Databricks data in Fabric and seamlessly use it across every Fabric workload.
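As a rough sketch of what that looks like in practice, a Fabric notebook could query a linked Unity Catalog table with Spark SQL just like any other lakehouse table; the lakehouse, schema, and table names below are hypothetical.

```python
# Runs inside a Fabric notebook, where `spark` is the provided SparkSession.
# The three-part table name below is a hypothetical Unity Catalog shortcut
# surfaced through a lakehouse; adjust it to your own item names.
df = spark.sql(
    """
    SELECT region, SUM(amount) AS total_sales
    FROM sales_lakehouse.dbo.orders
    GROUP BY region
    ORDER BY total_sales DESC
    """
)
df.show()
```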

Also coming soon, Fabric users will be able to access Fabric data items like lakehouses as a catalog in Azure Databricks. While the data remains in OneLake, you can access and view data lineage and other metadata in Azure Databricks and leverage the full power of Unity Catalog. This includes extending Unity Catalog's unified governance over data and AI into Azure Databricks Mosaic AI. Altogether, you will be able to combine this data with other native and federated data in Azure Databricks, perform analysis assisted by generative AI, and publish the aggregated data back to Power BI, making this integration complete across the entire data and AI lifecycle.

Join us at Microsoft Build from May 21 to 23, 2024, to see all of these announcements in action across multiple sessions.

You can also try out these new capabilities and everything Fabric has to offer yourself by signing up for a free 60-day trial (no credit card information required). To start your free trial, sign up for a free account (Power BI customers can use their existing account), and once signed in, select start trial within the account manager tool in the Fabric app. Existing Power BI Premium customers can already access Fabric by simply turning it on in the Fabric admin portal. Learn more on the Fabric get started page.

We are excited to announce a European Microsoft Fabric Community Conference that will be held in Stockholm, Sweden from September 23 to 26, 2024. You can see firsthand how Fabric and the rest of the data and AI products at Microsoft can help your organization prepare for the era of AI. You will hear from leading Microsoft and community experts from around the world and get hands-on experience with the latest features from Fabric, Power BI, Azure Databases, Azure AI, Microsoft Purview, and more. You will also have the opportunity to learn from top data experts and AI leaders while interacting with your peers and sharing your story. We hope you will join us and see how cutting-edge technologies from Microsoft can enable your business success with the power of Fabric.

1. "ChatGPT sets record for fastest-growing user base - analyst note," Reuters.

Arun Ulagaratchagan

Corporate Vice President, Azure Data, Microsoft

Arun leads product management, engineering, and cloud operations for Azure Data, which includes databases, data integration, big data analytics, messaging, and business intelligence. The products in his teams' portfolio include Azure SQL DB, Azure Cosmos DB, Azure PostgreSQL, Azure MySQL, Azure Data Factory, Azure Synapse Analytics, Azure Service Bus, Azure Event Grid, Power BI, and Microsoft Fabric.

Google Search’s New AI Overviews Will Soon Have Ads – WIRED

Last week Google introduced a radical shake-up of search that presents users with AI-generated answers to their queries. Now the company says it will soon start including ads inside those AI Overviews, as the automatic answers are called.

Google on Tuesday announced plans to test search and shopping ads in the AI summaries, a move that could extend its dominance in search advertising into a new era. Although Google rapidly rolled out AI Overviews to all US English users last week after announcing the feature at its I/O developer conference, it's unclear how widely or quickly ads will start appearing.

Screenshots released by Google show how a user asking how to get wrinkles out of clothes might get an AI-generated summary of tips sourced from the web, with a carousel of ads underneath for sprays that purport to help crisp up a wardrobe.

Google's AI Overviews are meant to keep users from shifting to alternatives such as ChatGPT or the startup Perplexity, which use AI-generated text to answer many questions traditionally thrown at Google. How and when Google would integrate ads into AI Overviews has been a significant question hanging over the company's ChatGPT catch-up strategy. Search ads are the company's largest revenue generator, and even subtle changes in ad placements or design can spur big swings in Google's revenue.

Google shared few details about its new Overview ad format in its announcement Tuesday. Ads "will have the opportunity to appear within the AI Overview in a section clearly labeled as sponsored when they're relevant to both the query and the information in the AI Overview," Vidhya Srinivasan, Google's vice president and general manager for ads, wrote in a blog post.

AI Overviews will draw on ads from advertisers' existing campaigns, meaning advertisers cannot completely opt out of the experiment, but they also do not have to adapt the settings and designs of their ads to appear in the feature. "There's no action needed from advertisers," Srinivasan wrote.

Google said last year, when it started experimenting with AI-generated answers in search, that ads for specific products would be integrated into the feature. In one example at the time, it showed a sponsored option at the top of an AI-generated list of kids' hiking backpacks. Google says the early testing showed that users found ads above and below AI summaries helpful. Google's much smaller rival Bing shows product ads in its Bing Copilot search chatbot, but in tests on Monday, WIRED didn't trigger any ads in Bing's competitor to AI Overviews.

Microsoft’s new Windows Copilot Runtime aims to win over AI developers – The Verge

Microsoft launched a range of Copilot Plus PCs yesterday that includes new AI features built directly into Windows 11. Behind the scenes, the company now has more than 40 AI models running on Windows 11 thanks to a new Windows Copilot Runtime that will also allow developers to use these models for their apps.

At Microsoft Build today, the company is providing a lot more details about exactly how this Windows Copilot Runtime works. The runtime includes a library of APIs that developers can tap into for their own apps, with AI frameworks and toolchains that are designed for developers to ship their own on-device models on Windows.

"Windows Copilot Library consists of ready-to-use AI APIs like Studio Effects, Live Captions Translations, OCR, Recall with User Activity, and Phi Silica, which will be available to developers in June," explains Windows and Surface chief Pavan Davuluri.

Developers will be able to use the Windows Copilot Library to integrate things like Studio Effects, filters, portrait blur, and other features into their apps. Meta is adding the Windows Studio Effects into WhatsApp, so you'll get features like background blur and eye contact during video calls. Even Live Captions and the new AI-powered translation feature can be used by developers with little to no code.

Microsoft demonstrated its Recall AI feature yesterday, allowing Copilot Plus PCs to document and store everything that you do on your PC so you can recall memories and search through a timeline. This is all powered by a new Windows Semantic Index that stores this data locally, and Microsoft plans to allow developers to build something similar.

"We will make this capability available for developers with Vector Embeddings API to build their own vector store and RAG within their applications and with their app data," says Davuluri.
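To ground the idea, the sketch below illustrates the generic retrieval step behind a vector store and retrieval-augmented generation (RAG), not the Windows Vector Embeddings API itself, which had not yet shipped; the embed() function is a stand-in for whatever embedding model an app would actually use.

```python
# Conceptual sketch of vector-store retrieval for RAG; not a Windows API.
import numpy as np


def embed(text: str) -> np.ndarray:
    """Placeholder embedding: a pseudo-random vector derived from the text."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


# Index app data (for example, notes or documents) as embedding vectors.
documents = ["meeting notes from Monday", "travel itinerary", "project budget"]
index = [(doc, embed(doc)) for doc in documents]

# At query time, retrieve the closest document and pass it to the language
# model as grounding context.
query_vec = embed("when is my flight?")
best_doc, _ = max(index, key=lambda item: cosine_similarity(query_vec, item[1]))
print("retrieved context:", best_doc)
```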

Developers will also be able to improve Windows' new Recall feature by adding contextual information from their apps that feeds into the database powering the feature. "This integration helps users pick up where they left off in your app, improving app engagement and users' seamless flow between Windows and your app," says Davuluri.

All of these improvements inside Windows for developers are the very early building blocks for more AI-powered apps on top of the new Arm-powered systems and the NPUs coming soon from AMD and Intel. While Microsoft is building the platform for developers to create AI apps for Windows, it's now banking on this being an important part of the next decade of Windows development. Onstage at Build today, Davuluri stood in front of a slide that read "Windows is the most open platform for AI," signaling just how important this moment is for Microsoft.

This is how I plan to explain AI PC to my confused friends and relatives – TechRadar

It occurred to me this morning that I will soon be explaining to a friend or relative what an AI PC is and what they're meant to do with it.

The answer seems obvious to me because I've been covering AI and PCs for decades. But as I try to articulate the meaning, I stumble:

None of that comes close to capturing it. What makes more sense is this:

A PC that works the way they promised it would when we first started computing.

Instead of a dense box full of information, memories, and apps that can go through it all, it's a wonder box that anticipates your intentions, takes actions on your behalf, and never leaves you wondering, "How do I do that?"

Granted, the AI PCs you'll see this summer are still not quite that. However, there will be hints of that power and potential.

Microsoft's one-button Copilot access across the new Surface Windows PCs it builds and myriad partner laptops and desktops is not just a marketing stunt. The Copilot button might initially be considered a "when all else fails" button. You hit it, and Copilot might rescue you because it lets you ask your question in a way that makes sense to you. An AI PC will know itself as you know yourself. It will know more about the computer's inner workings, settings, and AI-compliant apps than you do and might not make you wade through apps, settings, and menus to get results.

Any application's menu system is a developer's best guess at the intentions of millions of users, and when you try to satisfy everyone, you usually satisfy no one.

AI-integrated machines will outstrip the rudimentary intelligence of your average PC and apps with something approaching human reasoning. This could make your PC the digital partner you always wanted. Unlike tiny AI-infused gadgets like the Rabbit R1 and the Humane AI Pin, they won't insist you learn a new usage paradigm. These AI PCs look like your old PCs, which means you use them as you want, in whatever way makes you happy, and tap into that new AI superpower on an as-needed basis.

If my friends and relatives also ask how the PC can be so smart, where all that intelligence resides, and whether every question they ask ends up in the hands of a third party, that's when the conversation might get a little more complicated.

Breaking down this complex issue, I'd explain that most AI PCs will take a half-and-half approach. Some intelligence will be right there, in the brand-new AI brain, or NPU, but the rest could reside on cloud servers owned by Microsoft, Apple, or even Google. Choosing your new AI PC will come down to who you trust to keep your queries private.

I'm also pretty sure this explanation will hold up next month when Apple introduces its own AI Macs (they're also PCs, by the way).

Yeah, this is what I'll say if someone asks me.

Releasing a new paper on openness and artificial intelligence – Mozilla & Firefox

For the past six months, the Columbia Institute of Global Politics and Mozilla have been working with leading AI scholars and practitioners to create a framework on openness and AI. Today, we are publishing a paper that lays out this new framework.

During earlier eras of the internet, open source technologies played a core role in promoting innovation and safety. Open source technology provided a core set of building blocks that software developers have used to do everything from creating art to designing vaccines to developing apps used by people all over the world; it is estimated that open source software is worth over $8 trillion. And attempts to limit open innovation, such as export controls on encryption in early web browsers, ended up being counterproductive, further exemplifying the value of openness.

Today, open source approaches for artificial intelligence, and especially for foundation models, offer the promise of similar benefits to society. However, defining and empowering open source for foundation models has proven tricky, given its significant differences from traditional software development. This lack of clarity has made it harder to recommend specific approaches and standards for how developers should advance openness and unlock its benefits. Additionally, these conversations about openness in AI have often operated at a high level, making it harder to reason about the benefits and risks of openness in AI. Some policymakers and advocates have blamed open access to AI for certain safety and security risks, often without concrete or rigorous evidence to justify those claims. On the other hand, people often tout the benefits of openness in AI, but without specificity about how to actually harness those opportunities.

That's why, in February, Mozilla and the Columbia Institute of Global Politics brought together over 40 leading scholars and practitioners working on openness and AI for the Columbia Convening. These individuals, spanning prominent open source AI startups and companies, nonprofit AI labs, and civil society organizations, focused on exploring what "open" should mean in the AI era.

Today, we are publishing a paper that presents a framework for grappling with openness across the AI stack. The paper surveys existing approaches to defining openness in AI models and systems, and then proposes a descriptive framework to understand how each component of the foundation model stack contributes to openness. It enables, without prescribing, an analysis of how to unlock specific benefits from AI, based on desired model and system attributes. Furthermore, the paper adds clarity to support further work on this topic, including work to develop stronger safety safeguards for open systems.

We believe this framework will support timely conversations in the technical and policy communities. For example, this week, as policymakers discuss AI policy at the AI Seoul Summit 2024, this framework can help clarify how openness in AI can support societal and political goals, including innovation, safety, competition, and human rights. And, as the technical community continues to build and deploy AI systems, this framework can support AI developers in ensuring their AI systems help achieve their intended goals, promote innovation and collaboration, and reduce harms. We look forward to working with the open source and AI community, as well as the policy and technical communities more broadly, to continue building on this framework going forward.
