2025-10-17 17:18:33.408 -07:00 [INF] Command:ModelInit Status:Success Direct:True Time:0ms
2025-10-17 17:18:33.861 -07:00 [INF] UserAgent:foundry-local-CLI/0.7.120+3b92ed4014 Command:ListDownloadedModels Status:Success Direct:True Time:5ms
2025-10-17 17:18:33.861 -07:00 [INF] Command:ModelDownload Status:Skipped Direct:False Time:1698ms
2025-10-17 17:18:33.861 -07:00 [INF] Command:ServiceStart Status:Skipped Direct:False Time:0ms
2025-10-17 17:18:34.234 -07:00 [INF] Command:ServiceList Status:Success Direct:False Time:372ms
2025-10-17 17:18:34.234 -07:00 [INF] Loading model: http://127.0.0.1:56051/openai/load/gemma-3-1b-it_ort?ttl=600
2025-10-17 17:18:34.234 -07:00 [INF] UserAgent:foundry-local-CLI/0.7.120+3b92ed4014 Command:ListLoadedModels Status:Success Direct:True Time:0ms
2025-10-17 17:18:34.685 -07:00 [INF] Loading model:gemma-3-1b-it_ort
2025-10-17 17:18:40.704 -07:00 [INF] Finish loading model:gemma-3-1b-it_ort elapsed time:00:00:06.0190199
2025-10-17 17:18:40.704 -07:00 [INF] UserAgent:foundry-local-CLI/0.7.120+3b92ed4014 Command:ModelLoad Status:Success Direct:True Time:6019ms
2025-10-17 17:18:40.705 -07:00 [INF] 🟢 Model gemma-3-1b-it_ort loaded successfully
2025-10-17 17:18:40.705 -07:00 [INF] Command:ModelLoad Status:Success Direct:False Time:6470ms
2025-10-17 17:18:40.706 -07:00 [INF] Accepting pipe incoming pipeName:inference_agent numOfSession:3
2025-10-17 17:18:42.786 -07:00 [INF] HandleChatCompletionAsStreamRequest -> model:gemma-3-1b-it_ort MaxCompletionTokens:2048 maxTokens:(null) temperature:(null) topP:(null)
2025-10-17 17:18:42.786 -07:00 [INF] UserAgent:FoundryLocal OpenAI/2.2.0-beta.4 (.NET 9.0.8; Darwin 25.0.0 Darwin Kernel Version 25.0.0: Wed Sep 17 21:41:45 PDT 2025; root:xnu-12377.1.9~141/RELEASE_ARM64_T6000) Command:OpenAIChatCompletions Status:Success Direct:True Time:1ms
2025-10-17 17:18:42.787 -07:00 [INF] HandleChatCompletionAsStreamRequest -> model:gemma-3-1b-it_ort MaxCompletionTokens:2048 maxTokens:(null) temperature:(null) topP:(null)
2025-10-17 17:18:42.788 -07:00 [ERR] Connection id "0HNGDRFFE7H2H", Request id "0HNGDRFFE7H2H:00000001": An unhandled exception was thrown by the application.
2025-10-17 17:18:42.790 -07:00 [INF] Command:SessionInteractive Status:Failure Direct:False Time:2085ms
2025-10-17 17:18:42.790 -07:00 [INF] Command:ModelRun Status:Failure Direct:True Time:10627ms
2025-10-17 17:18:42.791 -07:00 [INF] Stream disconnected
2025-10-17 17:18:42.795 -07:00 [INF] LogException
Microsoft.AI.Foundry.Local.Common.FLException: Error during chat
 ---> System.Net.Http.HttpIOException: The response ended prematurely. (ResponseEnded)
   at System.Net.Http.HttpConnection.<FillAsync>d__82.MoveNext() + 0x2b0
--- End of stack trace from previous location ---
   at System.Net.Http.HttpConnection.ChunkedEncodingReadStream.<ReadAsyncCore>d__6.MoveNext() + 0x280
--- End of stack trace from previous location ---
   at System.ClientModel.Internal.ReadTimeoutStream.<ReadAsync>d__21.MoveNext() + 0x2b8
--- End of stack trace from previous location ---
   at System.Net.ServerSentEvents.SseParser`1.<FillLineBufferAsync>d__35.MoveNext() + 0x1d0
--- End of stack trace from previous location ---
   at System.Net.ServerSentEvents.SseParser`1.<EnumerateAsync>d__20.MoveNext() + 0x258
--- End of stack trace from previous location ---
   at OpenAI.AsyncSseUpdateCollection`1.AsyncSseUpdateEnumerator`1.<System-Collections-Generic-IAsyncEnumerator<U>-MoveNextAsync>d__12.MoveNext() + 0x1ec
--- End of stack trace from previous location ---
   at OpenAI.AsyncSseUpdateCollection`1.<GetValuesFromPageAsync>d__8.MoveNext() + 0x264
--- End of stack trace from previous location ---
   at OpenAI.AsyncSseUpdateCollection`1.<GetValuesFromPageAsync>d__8.MoveNext() + 0x3d8
--- End of stack trace from previous location ---
   at System.ClientModel.AsyncCollectionResult`1.<GetAsyncEnumerator>d__1.MoveNext() + 0x2f8
--- End of stack trace from previous location ---
   at System.ClientModel.AsyncCollectionResult`1.<GetAsyncEnumerator>d__1.MoveNext() + 0x6a0
--- End of stack trace from previous location ---
   at System.ClientModel.AsyncCollectionResult`1.<GetAsyncEnumerator>d__1.MoveNext() + 0x7e0
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.<InteractiveNewRoundAsync>d__7.MoveNext() + 0x2ac
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.<InteractiveNewRoundAsync>d__7.MoveNext() + 0x680
   --- End of inner exception stack trace ---
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.HandleExceptionDuringChat(Exception) + 0x108
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.<InteractiveNewRoundAsync>d__7.MoveNext() + 0x8e8
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.<RunInteractiveNewRoundAsync>d__3.MoveNext() + 0x60
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Commands.ModelRunCommand.<<Create>b__1_0>d.MoveNext() + 0x15f4
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Common.CommandActionFactory.<>c__DisplayClass0_0`1.<<Create>b__0>d.MoveNext() + 0x25c
--- End of stack trace from previous location ---
   at System.CommandLine.NamingConventionBinder.CommandHandler.<GetExitCodeAsync>d__66.MoveNext() + 0x58
--- End of stack trace from previous location ---
   at System.CommandLine.NamingConventionBinder.ModelBindingCommandHandler.<InvokeAsync>d__11.MoveNext() + 0x68
--- End of stack trace from previous location ---
   at System.CommandLine.Invocation.InvocationPipeline.<InvokeAsync>d__0.MoveNext() + 0x1e4
--- End of stack trace from previous location ---
   at Microsoft.AI.Foundry.Local.Program.<Main>d__1.MoveNext() + 0x4e8
2025-10-17 17:18:48.960 -07:00 [INF] Starting Foundry Local CLI with 'service status'
2025-10-17 17:18:49.027 -07:00 [INF] Found service endpoints: http://127.0.0.1:56051
2025-10-17 17:18:49.421 -07:00 [INF] Command:ServiceStatus Status:Success Direct:True Time:388ms
2025-10-17 17:18:53.728 -07:00 [INF] Starting Foundry Local CLI with 'service diag --logs'