53 changes: 53 additions & 0 deletions Samples/AppContentSearch/README.md
@@ -0,0 +1,53 @@
> **Reviewer (Contributor):** Should we put this folder under Samples/WindowsAIFoundry? Since that's how I discover Windows AI related features.
>
> **Author:** Would I be moving the other sample under Samples/WindowsAIFoundry/cs-winui in its own directory and placing the AppContentSearch sample in that same folder?

---
page_type: sample
languages:
- csharp
products:
- windows
- windows-app-sdk
name: "AppContentSearch Sample"
urlFragment: AppContentSearchSample
description: "Demonstrates how to use the AppContentSearch APIs in Windows App SDK to index and semantically search text content and images in a WinUI3 notes application."
extendedZipContent:
- path: LICENSE
target: LICENSE
---


# AppContentSearch Sample Application

This sample demonstrates how to use App Content Search's **AppContentIndex APIs** in a **WinUI3** notes application. It shows how to create, manage, and semantically search through an index that includes both text content and images. It also shows how to use the search results to enable retrieval augmented generation (RAG) scenarios with language models.

> **Note**: This sample targets and was tested with **Windows App SDK 2.0 Experimental2** and **Visual Studio 2022**. The AppContentSearch APIs are experimental and available starting in Windows App SDK 2.0 Experimental2.


## Features

This sample demonstrates:

- **Creating an Index**: Create an index with optional settings.
- **Indexing Content**: Add, update, and remove content from the index.
- **Text Content Search**: Query the index for text-based results.
- **Image Content Search**: Query the index for image-based results.
- **Search Results Display**: Display text and image search results with relevance highlighting and bounding boxes for image matches.
- **Retrieval Augmented Generation (RAG)**: Use search results with language models for RAG scenarios.
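The RAG scenario listed above boils down to assembling retrieved note snippets into a grounded prompt before calling the language model. The sketch below is illustrative only and is not code from this sample; plain strings stand in for the AppContentSearch result types.

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Illustrative sketch: combine retrieved snippets and a user question into a
// grounded prompt. Snippets are numbered so the model can cite its sources.
public static class RagPromptBuilder
{
    public static string Build(string userQuestion, IReadOnlyList<string> retrievedSnippets)
    {
        var sb = new StringBuilder();
        sb.AppendLine("Answer the question using only the context below.");
        sb.AppendLine("Context:");
        for (int i = 0; i < retrievedSnippets.Count; i++)
        {
            sb.AppendLine($"[{i + 1}] {retrievedSnippets[i]}");
        }
        sb.AppendLine($"Question: {userQuestion}");
        return sb.ToString();
    }
}
```

The resulting string can then be passed to any `IChatClient` implementation, such as the `PhiSilicaClient` in this sample.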


## Prerequisites

* See [System requirements for Windows app development](https://docs.microsoft.com/windows/apps/windows-app-sdk/system-requirements).
* Make sure that your development environment is set up correctly—see [Install tools for developing apps for Windows 10 and Windows 11](https://docs.microsoft.com/windows/apps/windows-app-sdk/set-up-your-development-environment).
* This sample requires Visual Studio 2022 and .NET 9.


## Building and Running the Sample

* Open the solution file (`AppContentSearch.sln`) in Visual Studio.
* Press Ctrl+Shift+B, or select **Build** \> **Build Solution**.
* Run the application to see the Notes app with integrated search functionality.


## Related Documentation and Code Samples

* [Windows App SDK](https://docs.microsoft.com/windows/apps/windows-app-sdk/)
* [AppContentSearch API Documentation](https://learn.microsoft.com/en-us/windows/ai/apis/app-content-search)
242 changes: 242 additions & 0 deletions Samples/AppContentSearch/cs-winui/AI/IChatClient/PhiSilicaClient.cs
@@ -0,0 +1,242 @@
// Copyright (c) Microsoft Corporation. All rights reserved.

using Microsoft.Extensions.AI;
using Microsoft.Windows.AI.ContentSafety;
using Microsoft.Windows.AI.Text;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Windows.Foundation;

namespace Notes.AI;

internal class PhiSilicaClient : IChatClient
{
// Search Options
private const int DefaultTopK = 50;
private const float DefaultTopP = 0.9f;
private const float DefaultTemperature = 1;

private LanguageModel? _languageModel;

public ChatClientMetadata Metadata { get; }

private PhiSilicaClient()
{
Metadata = new ChatClientMetadata("PhiSilica", new Uri("file:///PhiSilica"));
}

private static ChatOptions GetDefaultChatOptions()
{
return new ChatOptions
{
Temperature = DefaultTemperature,
TopP = DefaultTopP,
TopK = DefaultTopK,
};
}

public static async Task<PhiSilicaClient?> CreateAsync(CancellationToken cancellationToken = default)
{
#pragma warning disable CA2000 // Dispose objects before losing scope
var phiSilicaClient = new PhiSilicaClient();
#pragma warning restore CA2000 // Dispose objects before losing scope

try
{
await phiSilicaClient.InitializeAsync(cancellationToken);
}
catch
{
return null;
}

return phiSilicaClient;
}

public Task<ChatResponse> GetResponseAsync(IList<ChatMessage> chatMessages, ChatOptions? options = null, CancellationToken cancellationToken = default) =>
GetStreamingResponseAsync(chatMessages, options, cancellationToken).ToChatResponseAsync(cancellationToken: cancellationToken);

public async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(IList<ChatMessage> chatMessages, ChatOptions? options = null, [EnumeratorCancellation] CancellationToken cancellationToken = default)
{
if (_languageModel == null)
{
throw new InvalidOperationException("Language model is not loaded.");
}

string prompt = GetPromptAsString(chatMessages);

await foreach (var part in GenerateStreamResponseAsync(prompt, options, cancellationToken))
{
yield return new ChatResponseUpdate
{
Role = ChatRole.Assistant,
Text = part,
};
}
}

private (LanguageModelOptions? ModelOptions, ContentFilterOptions? FilterOptions) GetModelOptions(ChatOptions? options)
{
if (options == null)
{
return (null, null);
}

var languageModelOptions = new LanguageModelOptions
{
Temperature = options.Temperature ?? DefaultTemperature,
TopK = (uint)(options.TopK ?? DefaultTopK),
// TopP is a probability in [0, 1]; it must stay floating point
// (casting it to uint would truncate 0.9 to 0).
TopP = options.TopP ?? DefaultTopP,
};

var contentFilterOptions = new ContentFilterOptions();

if (options?.AdditionalProperties?.TryGetValue("input_moderation", out SeverityLevel inputModeration) == true && inputModeration != SeverityLevel.Low)
{
contentFilterOptions.PromptMaxAllowedSeverityLevel = new TextContentFilterSeverity
{
Hate = inputModeration,
Sexual = inputModeration,
Violent = inputModeration,
SelfHarm = inputModeration
};
}

if (options?.AdditionalProperties?.TryGetValue("output_moderation", out SeverityLevel outputModeration) == true && outputModeration != SeverityLevel.Low)
{
contentFilterOptions.ResponseMaxAllowedSeverityLevel = new TextContentFilterSeverity
{
Hate = outputModeration,
Sexual = outputModeration,
Violent = outputModeration,
SelfHarm = outputModeration
};
}

return (languageModelOptions, contentFilterOptions);
}
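`GetModelOptions` above maps nullable `ChatOptions` values onto defaults with the null-coalescing operator. One detail worth calling out: top-p is a probability in [0, 1], so it must remain floating point end to end; an explicit cast to an integer type truncates 0.9 to 0 and silently disables nucleus sampling. A minimal stand-alone sketch of the mapping (the record type here is a stand-in, not a real SDK type):

```csharp
using System;

// Stand-in sampling-option record, assumed for illustration; not the
// Microsoft.Extensions.AI or Windows AI types.
public record SamplingDefaults(float Temperature = 1f, float TopP = 0.9f, uint TopK = 50);

public static class OptionMapper
{
    // Maps possibly-null caller values onto defaults. TopP stays float:
    // (uint)0.9f would truncate to 0 and disable nucleus sampling.
    public static SamplingDefaults Map(float? temperature, float? topP, int? topK)
    {
        var d = new SamplingDefaults();
        return d with
        {
            Temperature = temperature ?? d.Temperature,
            TopP = topP ?? d.TopP,
            TopK = (uint)(topK ?? (int)d.TopK),
        };
    }
}
```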

private string GetPromptAsString(IEnumerable<ChatMessage> chatHistory)
{
if (!chatHistory.Any())
{
return string.Empty;
}

StringBuilder prompt = new StringBuilder();

// Enumerate once; Count()/ElementAt would re-enumerate the sequence per message.
foreach (var message in chatHistory)
{
if (!string.IsNullOrEmpty(message.Text))
{
prompt.AppendLine(message.Text);
}
}

return prompt.ToString();
}

public void Dispose()
{
_languageModel?.Dispose();
_languageModel = null;
}

public object? GetService(Type serviceType, object? serviceKey = null)
{
return
serviceKey is not null ? null :
_languageModel is not null && serviceType?.IsInstanceOfType(_languageModel) is true ? _languageModel :
serviceType?.IsInstanceOfType(this) is true ? this :
serviceType == typeof(ChatOptions) ? GetDefaultChatOptions() :
null;
}
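The ternary chain in `GetService` implements type-based service resolution: keyed requests return null, then the wrapped model, the client itself, and finally default options are checked in order. A trimmed-down, self-contained version of the pattern (with stand-in types rather than the real `IChatClient` interfaces):

```csharp
using System;

// A minimal type-based service lookup mirroring the ternary-chain pattern.
// Random is a stand-in for the wrapped language model.
public sealed class ServiceHost
{
    private readonly object _inner = new Random();

    public object? GetService(Type serviceType, object? serviceKey = null) =>
        serviceKey is not null ? null :                 // keyed lookups unsupported
        serviceType.IsInstanceOfType(_inner) ? _inner : // expose the wrapped object
        serviceType.IsInstanceOfType(this) ? this :     // expose the host itself
        null;                                           // nothing else available
}
```

Callers probe for capabilities by type and fall back gracefully when the lookup returns null.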

public static bool IsAvailable()
{
try
{
return LanguageModel.GetReadyState() == Microsoft.Windows.AI.AIFeatureReadyState.Ready;
}
catch
{
return false;
}
}

private async Task InitializeAsync(CancellationToken cancellationToken = default)
{
cancellationToken.ThrowIfCancellationRequested();

if (!IsAvailable())
{
await LanguageModel.EnsureReadyAsync();
}

cancellationToken.ThrowIfCancellationRequested();

_languageModel = await LanguageModel.CreateAsync();
}

#pragma warning disable IDE0060 // Remove unused parameter
public async IAsyncEnumerable<string> GenerateStreamResponseAsync(string prompt, ChatOptions? options = null, [EnumeratorCancellation] CancellationToken cancellationToken = default)
#pragma warning restore IDE0060 // Remove unused parameter
{
if (_languageModel == null)
{
throw new InvalidOperationException("Language model is not loaded.");
}

string currentResponse = string.Empty;
using var newPartEvent = new ManualResetEventSlim(false);

IAsyncOperationWithProgress<LanguageModelResponseResult, string>? progress;
if (options == null)
{
progress = _languageModel.GenerateResponseAsync(prompt, new LanguageModelOptions());
}
else
{
// GetModelOptions also produces content-filter options, but only the
// model options are consumed by this call.
var (modelOptions, _) = GetModelOptions(options);
progress = _languageModel.GenerateResponseAsync(prompt, modelOptions);
}

progress.Progress = (result, value) =>
{
currentResponse = value;
newPartEvent.Set();
if (cancellationToken.IsCancellationRequested)
{
progress.Cancel();
}
};

while (progress.Status != AsyncStatus.Completed)
{
await Task.CompletedTask.ConfigureAwait(ConfigureAwaitOptions.ForceYielding);

if (newPartEvent.Wait(10, cancellationToken))
{
yield return currentResponse;
newPartEvent.Reset();
}
}

var response = await progress;

yield return response?.Status switch
{
LanguageModelResponseStatus.BlockedByPolicy => "\nBlocked by policy",
LanguageModelResponseStatus.PromptBlockedByContentModeration => "\nPrompt blocked by content moderation",
LanguageModelResponseStatus.ResponseBlockedByContentModeration => "\nResponse blocked by content moderation",
_ => string.Empty,
};
}
}
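`GenerateStreamResponseAsync` bridges a callback-style progress API into an `IAsyncEnumerable<string>` by signaling a `ManualResetEventSlim` from the progress handler and polling it from the iterator. The same pattern can be sketched without any Windows AI dependencies; the producer below is a stand-in for the `GenerateResponseAsync` progress events:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Self-contained sketch of the callback-to-async-stream bridge used above.
public static class CallbackBridge
{
    // Simulated producer: reports cumulative response parts via a callback,
    // standing in for the model's progress events.
    public static Task RunProducer(Action<string> onProgress) => Task.Run(() =>
    {
        foreach (var part in new[] { "Hel", "Hello", "Hello world" })
        {
            onProgress(part);
            Thread.Sleep(5);
        }
    });

    public static async IAsyncEnumerable<string> StreamAsync()
    {
        string current = string.Empty;
        string lastYielded = string.Empty;
        using var newPart = new ManualResetEventSlim(false);

        var producer = RunProducer(value =>
        {
            current = value;   // string assignment is atomic in .NET
            newPart.Set();
        });

        // Poll until the producer completes, yielding each observed part.
        while (!producer.IsCompleted)
        {
            await Task.Yield();
            if (newPart.Wait(10))
            {
                newPart.Reset();
                lastYielded = current;
                yield return lastYielded;
            }
        }

        // Flush the final part if the loop exited before observing it.
        if (current != lastYielded)
        {
            yield return current;
        }
    }
}
```

Note that intermediate parts can be coalesced if the producer outpaces the consumer; only the latest cumulative value is guaranteed to be delivered, which matches the cumulative-response behavior of the progress callback in the sample.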