by Joche Ojeda | Mar 12, 2025 | dotnet, http, netcore, netframework, network, WebServers
Last week, I was diving into Uno Platform to understand its UI paradigms. What particularly caught my attention is Uno’s ability to render a webapp using WebAssembly (WASM). Having worked with WASM apps before, I’m all too familiar with the challenges of connecting to data sources and handling persistence within these applications.
My Previous WASM Struggles
About a year ago, I faced a significant challenge: connecting a desktop WebAssembly app to an old WCF webservice. Despite having the CORS settings correctly configured (or so I thought), I simply couldn’t establish a connection from the WASM app to the server. I spent days troubleshooting both the WCF service and another ASMX service, but both attempts failed. Eventually, I had to resort to webserver proxies to achieve my goal.
This experience left me somewhat traumatized by the mere mention of “connecting WASM with an API.” However, the time came to face this challenge again during my weekend experiments.
A Pleasant Surprise with Uno Platform
This weekend, I wanted to connect a XAF REST API to an Uno Platform client. To my surprise, it turned out to be incredibly straightforward. I successfully performed this procedure twice: once with a XAF REST API and once with the API included in the Uno app template. The ease of this integration was a refreshing change from my previous struggles.
Understanding CORS and Why It Matters for WASM Apps
To understand why my previous attempts failed and my recent ones succeeded, it’s important to grasp what CORS is and why it’s crucial for WebAssembly applications.
What is CORS?
CORS (Cross-Origin Resource Sharing) is a security feature implemented by web browsers that restricts web pages from making requests to a domain different from the one that served the original web page. It’s an HTTP-header based mechanism that allows a server to indicate which origins (domains, schemes, or ports) other than its own are permitted to load resources.
The Same-Origin Policy
Browsers enforce a security restriction called the “same-origin policy” which prevents a website from one origin from requesting resources from another origin. An origin consists of:
- Protocol (HTTP, HTTPS)
- Domain name
- Port number
For example, if your website is hosted at https://myapp.com, it cannot make AJAX requests to https://myapi.com without the server explicitly allowing it through CORS.
Why CORS is Required for Blazor WebAssembly
Blazor WebAssembly (which uses similar principles to Uno Platform’s WASM implementation) is fundamentally different from Blazor Server in how it operates:
- Separate Deployment: Blazor WebAssembly apps are fully downloaded to the client’s browser and run entirely in the browser using WebAssembly. They’re typically hosted on a different server or domain than your API.
- Client-Side Execution: Since all code runs in the browser, when your Blazor WebAssembly app makes HTTP requests to your API, they’re treated as cross-origin requests if the API is hosted on a different domain, port, or protocol.
- Browser Security: Modern browsers block these cross-origin requests by default unless the server (your API) explicitly permits them via CORS headers.
Implementing CORS in Startup.cs
The solution to these CORS issues lies in properly configuring your server. In your Startup.cs file, you can configure CORS as follows:
public void ConfigureServices(IServiceCollection services) {
    services.AddCors(options => {
        options.AddPolicy("AllowBlazorApp",
            builder => {
                builder.WithOrigins("https://localhost:5000") // Replace with your Blazor app's URL
                       .AllowAnyHeader()
                       .AllowAnyMethod();
            });
    });

    // Other service configurations...
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env) {
    // Other middleware configurations...

    // UseCors must run after UseRouting and before UseAuthorization/endpoint mapping
    app.UseCors("AllowBlazorApp");

    // Other middleware configurations...
}
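For context, here is a minimal sketch of the kind of client-side call that triggers the CORS check in the first place (the base address, endpoint, and Customer type are placeholders of mine, not from an actual project):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Json;

// The WASM app runs entirely in the browser, so this request is cross-origin
// whenever the API lives on a different scheme, domain, or port.
var http = new HttpClient { BaseAddress = new Uri("https://localhost:5001") };

// The browser attaches the Origin header automatically; if the response lacks
// a matching Access-Control-Allow-Origin header, the browser blocks it.
var customers = await http.GetFromJsonAsync<List<Customer>>("api/customers");

public record Customer(int Id, string Name);

With the "AllowBlazorApp" policy above in place on the server, this call succeeds; without it, the browser reports a CORS error before your code ever sees a response.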
Conclusion
My journey with connecting WebAssembly applications to APIs has had its ups and downs. What once seemed like an insurmountable challenge has now become much more manageable, especially with platforms like Uno that simplify the process. Understanding CORS and implementing it correctly is crucial for successful WASM-to-API communication.
If you’re working with WebAssembly applications and facing similar challenges, I hope my experience helps you avoid some of the pitfalls I encountered along the way.
by Joche Ojeda | Mar 11, 2025 | network
DNS and Virtual Hosting: A Personal Journey
In this article, I’m going to talk about a topic I’ve been working on lately because I’m creating a course on how to host ASP.NET Core applications on Linux. This is a trick that I learned a really long time ago.
I was talking with one of my students, Lance, who asked me when I learned all this hosting and server stuff. It’s actually a nice story.
My Early Server Adventures
When I was around 16 years old, I got a book on networking and figured out how to find free public IPs on my Internet provider’s network. A few years later, when I was 19, I got a book on Windows 2000 Server and managed to get a copy of that Windows version.
I had a great combination of resources:
- Public IPs that I was “borrowing” from my Internet provider
- A copy of Windows Server
- An extra machine (at that time, we only had one computer at home, unlike now where I have about 5 computers)
I formatted the extra computer using Windows Server 2000 and set up DNS using a program called Simple DNS. I also set up the IIS web server. Finally, for the first time in my life, I could host a domain from a computer at home.
In El Salvador, .sv domains were free at that time—you just needed to fill out a form and you could get them for free for many years. Now they’re quite expensive, around $50, compared to normal domains.
The Magic of Virtual Hosting
What I learned was that you can host multiple websites or web applications on the same IP, without changing ports, by distinguishing them by hostname or domain name instead.
Here’s how DNS works: When you have an internet connection, it has several parts—the IP address, the subnet mask, the gateway, and the DNS servers. The DNS servers essentially house a simple file where they have translations: this domain (like HotCoder.com) translates to this IP address. They make IP addresses human-readable.
Once requests go to the server side, the server checks which domain name is being requested and then picks from all the websites being hosted on that server and responds accordingly.
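Since the course is about hosting ASP.NET Core applications, here is a minimal sketch of that idea in code (the hostnames are hypothetical; in practice a web server like IIS or nginx does this selection for you):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Http;

var app = WebApplication.Create(args);

// Name-based virtual hosting: inspect the Host header the browser sent
// and answer for the matching site, all on the same IP and port.
app.MapGet("/", (HttpContext ctx) => ctx.Request.Host.Host switch
{
    "blog.jocheojeda.com" => "This is the blog",
    "api.jocheojeda.com" => "This is the API",
    _ => "Default site"
});

app.Run();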
Creating DNS records was tricky the first time. I spent a lot of time reading about it. The internet wasn’t like it is now—we didn’t have AI to help us. I had to figure it out with books, and growing up in El Salvador, we didn’t always have the newest or most accurate books available.
The Hosts File: A Local DNS
In the most basic setup, you need a record which says “this domain goes to this IP,” and then maybe a CNAME record that does something similar. That’s what DNS servers do—they maintain these translation tables.
Each computer also has its own translation table, which is a text file. In Windows, it’s called the “hosts” file. If you’ve used computers for development, you probably know that there’s an IP address reserved for localhost: 127.0.0.1. When you type “localhost” in the browser, it translates to that IP address.
This translation doesn’t require an external network request. Instead, your computer checks the hosts file, where you can set up the same domain-to-IP translations locally. This is how you can test domains without actually buying them. You can say “google.com will be forwarded to this IP address” which can be on your own computer.
A Real-World Application
I used this principle just this morning. I have an old MSI computer from 2018—still a solid machine with an i7 processor and 64GB of RAM. I reformatted it last week and set up the Hyper-V server. Inside Hyper-V, I set up an Ubuntu machine to emulate hosting, and installed a virtual hosting manager called Webmin.
I know I could do everything via command line, but why write a lot of text when you can use a user interface?
Recently, we’ve been having problems with our servers. My business partner Javier (who’s like a brother to me) mentioned that we have many test servers without clear documentation of what’s inside each one. We decided to format some of them to make them clean test servers again.
One of our servers that was failing happens to host my blog—the very one you’re reading right now! Yesterday, Javier messaged me early in the morning (7 AM for me in Europe, around 9 PM for him in America) to tell me my blog was down. There seemed to be a problem with the server that I couldn’t immediately identify.
We decided to move to a bigger server. I created a backup of the virtual server (something I’ll discuss in a different post) and moved it to the Hyper-V virtual machine on my MSI computer. I didn’t want to redirect my real IP address and DNS servers to my home computer—that would be messy and prevent access to my blog temporarily.
Instead, I modified the hosts file on my computer to point to the private internal IP of that virtual server. This allowed me to test everything locally before making any public DNS changes.
Understanding DNS: A Practical Example
Let me explain how DNS actually works with a simple example using the domain jocheojeda.com and an IP address of 203.0.113.42.
How DNS Resolution Works with ISP DNS Servers
When you type jocheojeda.com in your browser, here’s what happens:

- Your browser asks your operating system to resolve jocheojeda.com
- Your OS checks its local DNS cache, doesn’t find it, and then asks your ISP’s DNS server
- If the ISP’s DNS server doesn’t know, it asks the root DNS servers, which direct it to the appropriate Top-Level Domain (TLD) servers for .com
- The TLD servers direct the ISP DNS to the authoritative DNS servers for jocheojeda.com
- The authoritative DNS server responds with the A record: jocheojeda.com -> 203.0.113.42
- Your ISP DNS server caches this information and passes it back to your computer
- Your browser can now connect directly to the web server at 203.0.113.42
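If you want to watch that resolution happen from .NET, a couple of lines will do it (a quick sketch of mine, not part of the original setup):

using System;
using System.Net;

// GetHostEntryAsync goes through the same resolver chain described above:
// local cache and hosts file first, then the configured DNS servers.
var entry = await Dns.GetHostEntryAsync("jocheojeda.com");
foreach (var address in entry.AddressList)
    Console.WriteLine(address); // e.g. 203.0.113.42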
DNS Records Explained
A Record (Address Record)
An A record maps a domain name directly to an IPv4 address:
jocheojeda.com. IN A 203.0.113.42
This tells DNS servers that when someone asks for jocheojeda.com, they should be directed to the server at 203.0.113.42.
CNAME Record (Canonical Name)
A CNAME record maps one domain name to another domain name:
www.jocheojeda.com. IN CNAME jocheojeda.com.
blog.jocheojeda.com. IN CNAME jocheojeda.com.
This means that www.jocheojeda.com and blog.jocheojeda.com are aliases for jocheojeda.com. When someone visits either of these subdomains, DNS will first resolve them to jocheojeda.com, and then resolve that to 203.0.113.42.
Using the Windows Hosts File
Now, let’s see what happens when you use the hosts file instead:

When using the hosts file:
- Your browser asks your operating system to resolve jocheojeda.com
- Your OS checks the hosts file first, before any external DNS servers
- It finds an entry: 192.168.1.10 jocheojeda.com
- The OS immediately returns the IP 192.168.1.10 to your browser
- Your browser connects to 192.168.1.10 instead of the actual public IP
- The external DNS servers are never consulted
The Windows hosts file is located at C:\Windows\System32\drivers\etc\hosts. A typical entry might look like:
# For local development
192.168.1.10 jocheojeda.com
192.168.1.10 www.jocheojeda.com
192.168.1.10 api.jocheojeda.com
This is incredibly useful for:
- Testing websites locally before going live
- Testing different server configurations without changing public DNS
- Redirecting domains during development or troubleshooting
- Blocking certain websites by pointing them to 127.0.0.1
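After editing the hosts file, a quick way to confirm the override took effect is to resolve the name from code (a sketch of mine; the OS consults the hosts file before asking any external DNS server):

using System;
using System.Net;

// With the 192.168.1.10 entries above in place, this prints the private IP
// instead of the domain's public address.
foreach (var address in Dns.GetHostAddresses("jocheojeda.com"))
    Console.WriteLine(address);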
Why This Matters for Development
By modifying your hosts file, you can work on multiple websites locally, all running on the same machine but accessible via different domain names. This perfectly mimics how virtual hosting works on real servers, but without needing to change any public DNS records.
This technique saved me when my blog server was failing. I could test everything locally using my actual domain name in the browser, making sure everything was working correctly before changing any public DNS settings.
Conclusion
Understanding DNS and how to manipulate it locally via the hosts file is a powerful skill for any developer or system administrator. It allows you to test complex multi-domain setups without affecting your live environment, and can be a lifesaver when troubleshooting server issues.
In future posts, I’ll dive deeper into server virtualization and how to efficiently manage multiple web applications on a single server.
by Joche Ojeda | Feb 7, 2025 | Uncategorized
I recently had the privilege of conducting a training session in Cairo, Egypt, focusing on modern application development approaches. The session covered two key areas that are transforming how we build business applications: application frameworks and AI integration.
Streamlining Development with Application Frameworks
One of the highlights was demonstrating DevExpress’s eXpressApp Framework (XAF). The students were particularly impressed by how quickly we could build fully-functional Line of Business (LOB) applications. XAF’s approach eliminates much of the repetitive coding typically associated with business application development:
- Automatic CRUD operations
- Built-in security system
- Consistent UI across different platforms
- Rapid prototyping capabilities
Seamless Integration: XAF Meets Microsoft Semantic Kernel
What made this training unique was demonstrating how XAF’s capabilities extend into AI territory. We built the entire AI interface using XAF itself, showcasing how a traditional LOB framework can seamlessly incorporate advanced AI features. The audience, coming primarily from JavaScript backgrounds with Angular and React experience, was particularly impressed by how this approach simplified the integration of AI into business applications.
During the demonstrations, we explored practical implementations using Microsoft Semantic Kernel. The students were fascinated by practical demonstrations of:
- Natural language processing for document analysis
- Automated content generation for business documentation
- Intelligent decision support systems
- Context-aware data processing
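To give a flavor of what those demos looked like, here is a minimal Microsoft Semantic Kernel sketch for the document-analysis case (an illustration under assumptions: the model ID, environment variable, and prompt are mine, not the training material):

using System;
using Microsoft.SemanticKernel;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(
        modelId: "gpt-4o",
        apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
    .Build();

string documentText = "..."; // the business document to analyze

// Render the prompt with the document injected, then let the model summarize it.
var result = await kernel.InvokePromptAsync(
    "Summarize the key obligations in this document: {{$document}}",
    new KernelArguments { ["document"] = documentText });

Console.WriteLine(result);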
Student Engagement and Outcomes
The response from the students, most of whom came from JavaScript development backgrounds, was overwhelmingly positive. As experienced frontend developers using Angular and React, they were initially skeptical about a different approach to application development. However, their enthusiasm peaked when they saw how these technologies could solve real business challenges they face daily. The combination of XAF’s rapid development capabilities and Semantic Kernel’s AI features, all integrated into a cohesive development experience, opened their eyes to new possibilities in application development.
Looking Forward
This training session in Cairo demonstrated the growing appetite for modern development approaches in the region. The intersection of efficient application frameworks and AI capabilities is proving to be a powerful combination for next-generation business applications.
And last, but not least, some pictures )))
by Joche Ojeda | Dec 1, 2024 | Blazor
My journey with Microsoft Semantic Kernel marked the beginning of a new adventure: stepping out of my comfort zone as a backend developer to create applications with user interfaces, rather than just building apps for unit and integration testing.
I naturally chose Blazor as my UI framework, and I’ll be sharing my frontend development experiences here. Sometimes it can be frustratingly difficult to accomplish seemingly simple tasks (like centering a div!), but AI assistants like GitHub Copilot have been incredibly helpful in reducing those pain points.
One of my recent challenges involved programmatically including JavaScript and CSS in Blazor applications. I prefer an automated approach rather than manually adding tags to HTML. Back in the .NET 5 era, I wrote an article about using tag helpers for this purpose, which you can find here.
However, I recently discovered that my original approach no longer works. I’ve been developing several prototypes using the new DevExpress Chat component, and many of these prototypes include custom components that require JavaScript and CSS. Despite my attempts, I couldn’t get these components to work with the tag helpers, and the reason wasn’t immediately obvious. During the Thanksgiving break, I decided to investigate this issue, and I’d like to share what I found.
With the release of .NET 8, Blazor introduced a new web app template that unifies Blazor Server and WebAssembly into a single project structure. This change affects how we inject content into the document’s head section, particularly when working with Tag Helpers or components.
Understanding the Changes
In previous versions of Blazor, we typically worked with _Host.cshtml for server-side rendering, where traditional ASP.NET Core Tag Helpers could target the <head> element directly. The new .NET 8 Blazor Web App template uses App.razor as the root component and introduces the <HeadOutlet> component for managing head content.
Approach 1: Adapting Tag Helpers
If you’re migrating existing Tag Helpers or creating new ones for head content injection, you’ll need to modify them to target HeadOutlet instead of the head element:
using Microsoft.AspNetCore.Razor.TagHelpers;

namespace YourNamespace
{
    [HtmlTargetElement("HeadOutlet")]
    public class CustomScriptTagHelper : TagHelper
    {
        public override void Process(TagHelperContext context, TagHelperOutput output)
        {
            output.PostContent.AppendHtml(
                "<script src=\"_content/YourLibrary/js/script.js\"></script>"
            );
        }
    }
}
Remember to register your Tag Helper in _Imports.razor:
@addTagHelper *, YourLibrary
Approach 2: Using Blazor Components (Recommended)
While adapting Tag Helpers works, Blazor offers a more idiomatic approach using components and the HeadContent component. This approach aligns better with Blazor’s component-based architecture:
@namespace YourNamespace

<HeadContent>
    <script src="_content/YourLibrary/js/script.js"></script>
</HeadContent>
To use this component in your App.razor:
<head>
    <!-- Other head elements -->
    <HeadOutlet @rendermode="RenderModeForPage" />
    <YourScriptComponent @rendermode="RenderModeForPage" />
</head>
Benefits of the Component Approach
- Better Integration: Components work seamlessly with Blazor’s rendering model
- Render Mode Support: Easy to control rendering based on the current render mode (Interactive Server, WebAssembly, or Auto)
- Dynamic Content: Can leverage Blazor’s full component lifecycle and state management
- Type Safety: Provides compile-time checking and better tooling support
Best Practices
- Prefer the component-based approach for new development
- Use Tag Helpers only when migrating existing code or when you need specific ASP.NET Core pipeline integration
- Always specify the @rendermode attribute to ensure proper rendering in different scenarios
- Place custom head content components after HeadOutlet to ensure proper ordering
Conclusion
While both approaches work in .NET 8 Blazor Web Apps, the component-based approach using HeadContent provides a more natural fit with Blazor’s architecture and offers better maintainability and flexibility. When building new applications, consider using components unless you have a specific need for Tag Helper functionality.
by Joche Ojeda | Sep 26, 2024 | A.I, DevExpress, Semantic Kernel
If you’re a Blazor developer looking to integrate AI-powered chat functionality into your applications, the new DevExpress DxAIChat component offers a turnkey solution. It’s designed to make building chat interfaces as easy as possible, with out-of-the-box support for simple chats, virtual assistants, and even Retrieval-Augmented Generation (RAG) scenarios.
The best part? You don’t have to start from scratch—DevExpress provides a range of pre-built examples, making it easy to get started and customize to your needs. Whether you’re aiming for a basic chatbot or a more complex AI assistant, this component has you covered.
To run the examples, you can use any OpenAI-compatible service, such as Ollama, OpenAI, or Azure OpenAI. The current DevExpress example uses Azure, like this:
using DevExpress.AIIntegration;
...
string azureOpenAIEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
string azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string deploymentName = "YourModelDeploymentName";
...
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI((config) => {
    config.RegisterChatClientOpenAIService(
        new AzureOpenAIClient(
            new Uri(azureOpenAIEndpoint),
            new AzureKeyCredential(azureOpenAIKey)
        ), deploymentName);
    // or call the following method to use self-hosted Ollama models
    //config.RegisterChatClientOllamaAIService("http://localhost:11434/api/chat", "llama3.1");
});
I tested with the OpenAI API instead of Azure, so my code looks like this:
string azureOpenAIEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
string azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string OpenAiKey = Environment.GetEnvironmentVariable("OpenAiTestKey");
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI((config) => {
    //var client = new AzureOpenAIClient(
    //    new Uri(azureOpenAIEndpoint),
    //    new AzureKeyCredential(azureOpenAIKey));
    // OpenAI model IDs differ slightly from Azure deployment names: Azure=gpt4o, OpenAI=gpt-4o
    var client = new OpenAIClient(new System.ClientModel.ApiKeyCredential(OpenAiKey));
    config.RegisterChatClientOpenAIService(client, "gpt-4o");
    config.RegisterOpenAIAssistants(client, "gpt-4o");
});
Notice that the model IDs in Azure and OpenAI are different:
- Azure=gpt4o
- OpenAI=gpt-4o
These are the URLs for the different examples:
- Chat : https://localhost:53340/
- Assistant/RAG: https://localhost:53340/assistant
- Streaming: https://localhost:53340/streaming
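For reference, dropping the component onto a page is about as small as it gets; here is a minimal sketch based on the DevExpress examples (the route and CssClass value are placeholders of mine):

@page "/"

@* DxAIChat picks up the chat client registered via AddDevExpressAI at startup *@
<DxAIChat CssClass="my-chat" />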
I’m super happy that DevExpress components are doing all the heavy lifting and boilerplate code for us. I have developed the same scenarios using Semantic Kernel, and even though there isn’t much code to write, you still face the challenge of building a responsive UI.
For more information and to see the examples in action, check out the full article.