Category: News

  • Insights from the Secure Employee Access report reveal the need for unified access security

    Insights from the Secure Employee Access report reveal the need for unified access security

    Workplace security is becoming increasingly complex, and the Secure Employee Access in the Age of AI report offers critical insight into why. Traditional security measures are falling short as organizations transition to more dynamic work environments. According to the report, roughly 98% of security leaders stress the need for closer collaboration between identity and network teams to improve both security and efficiency. As cloud adoption and hybrid work patterns rise, the report emphasizes the need for a unified approach, particularly in an era where AI is becoming a fundamental part of business operations.

    Among the key challenges identified are the increasing number of identities and applications, which complicates security logistics as employees demand seamless access to essential apps. The complexities of hybrid work also introduce new attack vectors, with 61% of security leaders observing a rise in incidents linked to these models. The rapid adoption of AI is another significant concern, with 57% reporting more security incidents, emphasizing the pressing need for robust access controls in AI environments. Security leaders are advocating for a more streamlined strategy to minimize risk: prioritizing collaboration, consolidating disparate tools, and unifying identity and network access.

    Essentially, this report is a call to action. It suggests a clear roadmap towards securing employee access, moving beyond fragmented strategies to a unified approach that bolsters protection, organizational efficiency, and user experience. By simplifying and harmonizing access management, organizations can not only minimize breaches but also enhance productivity and adapt to future network security requirements in the age of AI.

    News: Insights from the Secure Employee Access report
    Documentation: Microsoft Entra documentation | Microsoft Learn

  • AI Procurement Assistant: Streamlining Procurement with Logic Apps and Prompt Templates

    AI Procurement Assistant: Streamlining Procurement with Logic Apps and Prompt Templates

    Create an efficient AI procurement assistant using the new Chat Completions with Prompt Template action in Logic Apps (Standard). This innovation allows organizations to automate procurement-related inquiries with ease. Imagine a scenario where an employee needs information on the last order of laptops for IT. Instead of directing this query to the procurement team, the AI assistant accepts the question, looks up relevant data, and generates a polished, AI-powered response using the information from previous orders and product catalogs. This reduces manual workload and increases efficiency.

    Prompt Templates use Jinja2 syntax to inject data dynamically at runtime, keeping the solution consistent and easy to maintain. Benefits include centralized prompt logic, reusability across workflows, and dynamic control over inputs coming directly from Logic Apps. Anyone with a Logic App (Standard) resource in Azure can build the flow in a few steps, using Compose actions to hold sample data and connecting to an Azure OpenAI resource for the chat completion. By mapping workflow data to the template's variables, the Logic App produces an adaptable, AI-driven flow whose outputs respond directly to the data it is given.
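    To make the template idea concrete, here is a minimal sketch of the kind of Jinja2 prompt the Prompt Template action renders at runtime, shown in plain Python with the jinja2 package. The variable names, question, and order data are illustrative assumptions, not values from the article.

    ```python
    from jinja2 import Template  # pip install jinja2

    # Hypothetical procurement prompt; Logic Apps injects the same kind of
    # variables from Compose actions or connector outputs at runtime.
    PROMPT = Template(
        "You are a procurement assistant.\n"
        "Question: {{ question }}\n"
        "Recent orders:\n"
        "{% for order in orders %}"
        "- {{ order.item }} x{{ order.qty }} on {{ order.date }}\n"
        "{% endfor %}"
        "Answer using only the order data above."
    )

    rendered = PROMPT.render(
        question="When did IT last order laptops?",
        orders=[
            {"item": "Laptop (14-inch)", "qty": 25, "date": "2025-03-02"},
            {"item": "USB-C dock", "qty": 25, "date": "2025-03-02"},
        ],
    )
    print(rendered)  # the rendered text becomes the chat-completion prompt
    ```

    Because the prompt text lives in one template, the same logic can be reused across workflows and updated without touching the surrounding Logic App.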

    News: AI Procurement Assistant Using Prompt Templates in Standard Logic Apps
    Documentation: Azure Logic Apps Documentation

  • Cyber Signals Issue 9 | AI-powered deception: Emerging fraud threats and countermeasures

    Cyber Signals Issue 9 | AI-powered deception: Emerging fraud threats and countermeasures

    In the latest edition of Cyber Signals, Microsoft delves into the evolving landscape of AI-powered fraud, revealing significant efforts to combat cyber threats. Between April 2024 and April 2025, Microsoft thwarted $4 billion in fraud attempts, rejected 49,000 fraudulent partnership enrollments, and blocked approximately 1.6 million bot signup attempts per hour. The report sheds light on how AI is being exploited to lower the technical skill required for fraudulent activities, enabling criminals to create believable content swiftly. This shift is making sophisticated scams more accessible and widespread, particularly in countries like China and Germany.

    One of the key areas of concern highlighted is fraudulent e-commerce websites. Scammers leverage AI tools to mimic legitimate websites convincingly, trapping unsuspecting customers. Similarly, the use of AI in job scams is increasing, with criminals crafting phony listings and conducting AI-powered interviews to deceive job seekers. Microsoft outlines various strategies to protect consumers, including smarter authentication protocols and deepfake detection. Additionally, they emphasize the critical role of public awareness alongside robust technology in combating these threats.

    Tech support scams, though not new, pose continuing risks. Microsoft observed cybercriminals abusing its Quick Assist tool to gain unauthorized access to devices. Through advanced security measures and collaboration with global partners, Microsoft reiterates its commitment to making products and services “Fraud-resistant by Design.” This includes using AI to detect and prevent threats in real time, as well as calling on technology companies and law enforcement to work together, and on the public to stay aware, in pursuit of a safer digital ecosystem.

    News: Cyber Signals Issue 9 | AI-powered deception: Emerging fraud threats and countermeasures
    Documentation: Domain impersonation protection

  • Blog Post from Microsoft Azure: I Built a Bot to Chat with Our Team’s Wiki Using Azure OpenAI Service

    Blog Post from Microsoft Azure: I Built a Bot to Chat with Our Team’s Wiki Using Azure OpenAI Service

    In a recent personal project, a Microsoft employee explored how Azure OpenAI Service could turn their team’s cumbersome Wiki into an easily searchable resource. The enormous size of the internal Wiki, typical of long-running projects, made it hard to find relevant information quickly. To tackle this, the author used Azure OpenAI “On Your Data,” which lets advanced models such as GPT-4 answer questions over business data without model training or fine-tuning. Although it began as an internal project, the approach improves efficiency and, because it is scalable and adaptable, could be applied well beyond the original Wiki.

    The implementation started from a tutorial to establish the core functionality using Azure services such as Blob Storage, Azure AI Search, and Azure OpenAI Service. A key step was making the bot dynamic by using an Indexer so that updates to the Wiki are picked up automatically. A Python script automates parsing of the Azure DevOps Wiki content, and custom metadata indexing gives users more contextual responses. Gaps in SDK coverage and the Managed Identity integration required workarounds, but they did not block the project. The setup highlights how AI can improve access to information, while also pointing to remaining work, such as securing the infrastructure against unauthorized access.
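    As a rough illustration of the “On Your Data” pattern the post builds on, the Python sketch below asks an Azure OpenAI chat deployment a question grounded in an Azure AI Search index. The environment variable names, index name, and deployment name are placeholders, and the author’s actual pipeline (DevOps Wiki parsing, Indexer, custom metadata) is not reproduced here.

    ```python
    import os
    from openai import AzureOpenAI  # pip install openai

    # Placeholder configuration; supply your own endpoints, keys, and names.
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )

    response = client.chat.completions.create(
        model="gpt-4",  # name of your Azure OpenAI chat deployment
        messages=[{"role": "user", "content": "How do we rotate the service credentials?"}],
        # "On Your Data": ground the answer in an Azure AI Search index of the wiki.
        extra_body={
            "data_sources": [
                {
                    "type": "azure_search",
                    "parameters": {
                        "endpoint": os.environ["AZURE_SEARCH_ENDPOINT"],
                        "index_name": "team-wiki-index",
                        "authentication": {
                            "type": "api_key",
                            "key": os.environ["AZURE_SEARCH_KEY"],
                        },
                    },
                }
            ]
        },
    )
    print(response.choices[0].message.content)
    ```

    In the author’s setup, the search index is kept fresh by an Indexer over the exported Wiki content, so the same call keeps returning answers grounded in the latest pages.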

    News: I Built a Bot to Chat with Our Team’s Wiki Using Azure OpenAI Service
    Documentation: Azure OpenAI On Your Data

  • Azure Files: More Performance, More Control, More Value for Your File Data

    Azure Files: More Performance, More Control, More Value for Your File Data

    Azure is stepping up its game with significant enhancements to Azure Files, aiming to provide businesses with more performance, control, and value for their data. Among the key updates is the introduction of the Provisioned v2 billing model for HDD (standard) Azure Files, offering predictable and budget-friendly pricing. This model allows organizations to pay for specific storage capacities, IOPS, and throughput, ensuring an optimal balance of performance and cost efficiency. Additionally, the performance limits have been expanded, allowing up to 256 TiB of maximum share size, 50,000 IOPS, and 5 GiB/sec of throughput, which simplifies workload management and obviates complicated workarounds.

    Furthermore, Azure addresses performance bottlenecks with metadata caching for SSD (premium) workloads, like AI/ML processes handled on Azure Kubernetes Service (AKS). This innovation significantly reduces latency, potentially improving metadata IOPS and throughput by up to threefold. The feature has already proven beneficial to companies like Suncor Energy. Additionally, Azure File Sync now supports Windows Server 2025 and provides an AI-powered assistant called Copilot, capable of analyzing environments, resolving issues, and optimizing storage costs.

    Security is also strengthened with Managed Identities support for Azure File Sync, hardening authentication and simplifying credential management. Vaulted backup further safeguards data, protecting against threats such as ransomware and accidental deletion. Azure’s migration solutions ease transitions from a range of environments to the platform, backed by a suite of industry-leading migration tools provided at no extra cost.

    News: Azure Files: More performance, more control, more value for your file data
    Documentation: Azure Files Introduction

  • Private Preview: Azure Backup for AKS Now Supports Azure File Share-based Persistent Volumes

    Private Preview: Azure Backup for AKS Now Supports Azure File Share-based Persistent Volumes

    Microsoft has unveiled a major enhancement in Azure Backup for Azure Kubernetes Service (AKS) that now supports Azure File Share-based Persistent Volumes. This feature is currently in Private Preview and aims to bring more flexibility and reliability to how users manage data within AKS applications. With this upgrade, users can utilize snapshot-based backups, extending the previous support for Azure Disks to now include Azure File Shares. This development is particularly beneficial for organizations seeking efficient and comprehensive data protection solutions in their cloud-native environments, allowing for seamless backups and restores without having to resort to third-party tools.

    With support for Azure File Share-based Persistent Volumes, customers have more options to tailor their storage strategy to their needs. Snapshot-based backups aim to improve recovery times and minimize potential data loss by capturing the state of the file share at regular intervals. As the capability is in Private Preview, interested users should watch Azure updates for broader availability and general release details.

    News: Azure Backup for AKS supports Azure File Share-based Persistent Volumes
    Documentation: Azure Backup Documentation

  • Using Security Copilot to Proactively Identify and Prioritize Vulnerabilities

    Using Security Copilot to Proactively Identify and Prioritize Vulnerabilities

    In the ever-evolving landscape of cybersecurity, the ability to proactively identify and prioritize vulnerabilities is more crucial than ever. This is where tools like Security Copilot come into play, offering organizations valuable insights into potential threats so they can address them before they escalate. A critical feed in this realm is provided by CISA’s Known Exploited Vulnerabilities (KEV) Catalog. This meticulously maintained list flags vulnerabilities that are actively exploited, providing essential details and mitigation guidance for cybersecurity professionals to fortify their defenses efficiently.

    To streamline this process, the CISA feed is integrated with Microsoft Defender for Endpoint through an automated Logic Apps workflow. By querying the latest CVE findings, the setup enables a targeted vulnerability assessment across devices and gives security analysts enriched descriptions and actionable remediation steps. Email notifications keep relevant stakeholders informed immediately, supporting a coordinated, proactive approach to vulnerability management.
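    The article implements this as a Logic Apps workflow; purely as a hedged illustration of the same two steps, the Python sketch below pulls the most recently added entries from the public CISA KEV feed and runs a Defender for Endpoint advanced-hunting query for those CVEs. The token acquisition, the exact KQL, and the notification step are assumptions for illustration, not the article’s implementation.

    ```python
    import requests

    KEV_URL = "https://www.cisa.gov/sites/default/files/feeds/known_exploited_vulnerabilities.json"

    def latest_kev_cves(count: int = 20) -> list[str]:
        """Return the most recently added CVE IDs from CISA's KEV catalog."""
        catalog = requests.get(KEV_URL, timeout=30).json()
        vulns = sorted(catalog["vulnerabilities"], key=lambda v: v["dateAdded"], reverse=True)
        return [v["cveID"] for v in vulns[:count]]

    def find_exposed_devices(cve_ids: list[str], access_token: str) -> dict:
        """Advanced-hunting query counting exposed devices per CVE (token handling omitted)."""
        kql = (
            "DeviceTvmSoftwareVulnerabilities "
            f"| where CveId in ({', '.join(repr(c) for c in cve_ids)}) "
            "| summarize Devices = dcount(DeviceId) by CveId"
        )
        resp = requests.post(
            "https://api.securitycenter.microsoft.com/api/advancedqueries/run",
            headers={"Authorization": f"Bearer {access_token}"},
            json={"Query": kql},
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()
    ```

    In the Logic Apps version, the same query results feed the email notification step so analysts see affected devices alongside the CISA remediation guidance.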

    This method of leveraging CISA’s feed is just one of the many strategies available. Complementary technologies such as Function Apps and AI-driven solutions from the Security Copilot GitHub repository offer further automation capabilities and insights through machine learning and natural language processing, allowing organizations to enhance their decision-making processes and security operations holistically.

    News: Using Security Copilot to Proactively Identify and Prioritize Vulnerabilities
    Documentation: Microsoft Defender Vulnerability Management

  • Public Preview: Remote Model Context Protocol (MCP) Support in Azure Functions

    Public Preview: Remote Model Context Protocol (MCP) Support in Azure Functions

    Microsoft has introduced a public preview of remote Model Context Protocol (MCP) support in Azure Functions. MCP standardizes how applications expose their capabilities and context to large language models. A central concept is the tool: an application defines tools, and AI agents call them to carry out tasks. This opens up new possibilities for developers who want to host more contextual, capable AI-driven applications on Azure, with a more robust communication framework between the model and the application it assists.

    MCP support in Azure Functions is an important milestone for developers streamlining AI interactions. With better-defined tools, context, and resources that applications can pass to language models, AI solutions become more dynamic and adaptable in how they respond and execute tasks. Developers interested in the public preview can start integrating these capabilities into their projects today.
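    To show the shape of it, here is a minimal sketch of an MCP tool exposed from an Azure Functions app using the Python v2 programming model. The binding name "mcpToolTrigger", its properties, and the payload handling follow the public preview samples and may change before general availability, so treat this as an assumption rather than the final API.

    ```python
    import json
    import azure.functions as func

    app = func.FunctionApp()

    # Preview-era generic trigger for the MCP tool binding; names may change.
    @app.generic_trigger(
        arg_name="context",
        type="mcpToolTrigger",
        toolName="get_snippet",
        description="Return a saved code snippet by name.",
        toolProperties=json.dumps(
            [{"propertyName": "name", "propertyType": "string",
              "description": "Name of the snippet to fetch."}]
        ),
    )
    def get_snippet(context: str) -> str:
        # The trigger payload arrives as JSON; tool arguments live under "arguments".
        args = json.loads(context).get("arguments", {})
        name = args.get("name", "unknown")
        return f"Snippet '{name}' is not implemented in this sketch."
    ```

    Once deployed, an MCP-capable agent connects to the function app's MCP endpoint and can discover and invoke the tool defined above.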

    News: Public Preview: Remote Model Context Protocol (MCP) support in Azure Functions
    Documentation: Azure Functions Documentation

  • Fabric Espresso: Optimizing Performance and Compute Management in Microsoft Fabric

    Fabric Espresso: Optimizing Performance and Compute Management in Microsoft Fabric

    Microsoft’s Fabric Espresso series delves into the intricacies of performance optimization and compute management within Microsoft Fabric. This enlightening series aims to provide developers and IT professionals with invaluable insights and strategies related to managing and enhancing performance in cloud environments. By tuning into these episodes, viewers can gain a deeper understanding of how to effectively operate within the Microsoft Fabric ecosystem, ensuring optimal resource utilization and improved scalability.

    The series breaks down complex topics into digestible segments, allowing both new and seasoned professionals to align their approaches with industry standards and practices. With a focus on practical application, the episodes encourage a hands-on approach, empowering users to implement the discussed techniques confidently. For anyone looking to leverage Microsoft Fabric to its full potential, this series is a fantastic resource filled with expert advice and innovation-driven solutions.

    News: Fabric Espresso – Episodes about Performance Optimization & Compute Management in Microsoft Fabric
    Documentation: Microsoft Fabric Documentation

  • Blog Post from Thomas Thornton: Why GitHub Copilot custom instructions matter

    Blog Post from Thomas Thornton: Why GitHub Copilot custom instructions matter

    GitHub Copilot has become part of the daily routine for many developers; integrated into the major IDEs, it makes coding more efficient. Still, there are times when Copilot suggests code that is almost right but doesn’t quite match your team’s stylistic or procedural norms, which is frustrating when suggestions deviate from expected naming conventions or coding standards. Custom instructions in GitHub Copilot address exactly this gap: they let developers guide the AI to align with their coding style and processes.

    Thomas Thornton’s post shows how custom instructions change the way developers interact with Copilot, helping the assistant fit naturally into team workflows and personal coding preferences. It highlights how tailored instructions create a smoother coding experience that reduces errors and improves productivity. By setting explicit guidelines for the AI, custom instructions are particularly useful for keeping long-running projects and cross-team collaborations consistent.
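    As a hedged example of what such guidance can look like, GitHub Copilot reads repository-wide custom instructions from a .github/copilot-instructions.md file; the rules below are invented for illustration and are not taken from the blog post.

    ```
    # .github/copilot-instructions.md

    - Use Terraform for all infrastructure code and follow the existing module layout under /infra.
    - Name Azure resources with the pattern <env>-<workload>-<resource>.
    - Prefer async/await over callbacks in application code.
    - Add unit tests alongside every new function, following the project's existing test conventions.
    ```

    Because the file lives in the repository, every contributor (and Copilot) works from the same conventions without repeating them in each prompt.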

    News: https://thomasthornton.cloud/2025/04/16/why-github-copilot-custom-instructions-matter/
    Documentation: https://docs.github.com/en/copilot