
Tuesday, June 23, 2020

Server-sent event (SSE) with ASHX (.Net framework 4.6)

It took me a while to get a working Server-Sent Events (SSE) implementation in .NET Framework 4.6.

Notes:

- This new ASHX must not access any session data; otherwise, other requests from the same user will be "blocked". To work around this, you might have to store the user's session data in a database or in memory (for example, a static dictionary - see the sketch after these notes).

- If the user opens multiple tabs, the server must force-close the earlier connections. This reduces the use of server and browser resources.

- Creating more EventSource objects can have side effects in the browser, as it reduces the performance of downloading CSS, images, etc. If you need more EventSource objects, you may have to create more sub-domains such as WWW1, WWW2, ...

- Don't forget that every request to ASP.NET consumes a shared thread, which can affect the application's performance. The server also has a connection limit.

- It's best to use HTTP/2 for SSE, which performs better than HTTP/1.
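
For the session-data note above, here is a minimal sketch of an in-memory store; the class name and the string payload are assumptions, and anything beyond a single server should use a database or a distributed cache instead.

using System.Collections.Concurrent;

// A hypothetical in-memory store keyed by user ID, shared by all requests.
public static class UserStateStore
{
    private static readonly ConcurrentDictionary<string, string> _data =
        new ConcurrentDictionary<string, string>();

    public static void Set(string userId, string value)
    {
        _data[userId] = value;
    }

    public static string Get(string userId)
    {
        string value;
        return _data.TryGetValue(userId, out value) ? value : null;
    }
}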

Here's the server-side code: add a new handler, notify_me_endlessly.ashx.


<%@ WebHandler Language="C#" Class="notify_me_endlessly" %>

using System;
using System.Web;
using System.Threading;
using System.Threading.Tasks;

public class notify_me_endlessly : IHttpHandler
{

    /// <summary>
    /// NOTES:
    /// - use HTTP/2 (with SSL/TLS) to reduce the request payload.
    /// Otherwise, the network might be flooded.
    ///
    /// </summary>
    /// <param name="context"></param>
    public void ProcessRequest(HttpContext context)
    {
        // to indicate the content type is for EventSource() .
        context.Response.ContentType = "text/event-stream";

        // hold the connection in a separate thread.
        // NOTES: this might overload the server with client connections!!

        Task t = Task.Run(() =>
        {

            // run endlessly until the client closes the connection      //<<=======
            while (true)
            {

                try
                {
                    //this is IMPORTANT. The message
                    //sent to the caller must be "data: xxx".
                    context.Response.Write(string.Format("data: {0}\n\n",
                    DateTime.Now.ToString("d/MMM/yy @ HH:mm:ss.fff")));

                    System.Diagnostics.Debug.WriteLine(string.Format("{0}-sending data to browser..",
                        DateTime.Now.ToString("d/MMM/yy @ HH:mm:ss.fff")));

                    context.Response.Flush();

                    Thread.Sleep(2000);
                }
                catch //(Exception x)
                {
                    System.Diagnostics.Debug.WriteLine(string.Format("{0}-failed sending data to client",
                          DateTime.Now.ToString("d/MMM/yy @ HH:mm:ss.fff")));
                    break;
                }
            }

        });

        // wait until end of the TASK.           //<<=======
        t.Wait();

        // disconnect the caller - this is optional and the connection
        // will be closed upon exiting this proc.       
        //context.Response.Close();

        System.Diagnostics.Debug.WriteLine(string.Format("{0}-client has quit",
                                                   DateTime.Now.ToString("d/MMM/yy @ HH:mm:ss.fff")));

    }

    public bool IsReusable
    {
        get
        {
            return false;
        }
    }

}


Here's the front end which consumes the SSE:


<!DOCTYPE html>
<html>
<head>
    <title></title>
    <meta charset="utf-8" />
</head>
<body>

    <h1>
        Msg from server
    </h1>

    <input type="button" id="btnListen" value="Start Listening" onclick="startListening(); return false;" />
    <input type="button" id="btnListen" value="Stop Listening" onclick="stopListening(); return false;" />
    <ul id="my-list"></ul>


    <script type="text/javascript">
        var sse = null;
        var mylist = document.getElementById('my-list');

        function startListening() {

            //-------------------------------
            var newElement = document.createElement("li");
            newElement.textContent = "message: connecting..";
            mylist.appendChild(newElement);

            //reset before use.
            if (sse != null) {
                sse.close();
                sse = null;
            }

            sse = new EventSource('notify_me_endlessly.ashx');
          

            //-------------------------------
            sse.addEventListener("open", function (event) {
                var newElement = document.createElement("li");
                newElement.textContent = "message: connected";
                mylist.appendChild(newElement);

                console.log('onopen=', event);

            }, false);

            sse.addEventListener("error", function (event) {
                var newElement = document.createElement("li");

                if (sse.readyState == 2) {
                    newElement.textContent = "ERR message: connection closed";
                }
                else if (sse.readyState == 1) {
                    newElement.textContent = "ERR message: open";
                }
                else if (sse.readyState == 0) {
                    newElement.textContent = "ERR message: connecting..";
                }
                else {
                    newElement.textContent = "ERR message: failed-";
                }

                mylist.appendChild(newElement);

                //22.Jun.20,lhw-you might have to call sse.close() manually upon any error
                // and reconnect after x seconds using setInterval() - in this case, you
                // will have full control over the reconnecting behavior (since all
                // browsers have slightly different implementations).

                console.log('onerror=', event);

            }, false);


            sse.addEventListener("message", function (e) {
                var newElement = document.createElement("li");
                newElement.textContent = "message: " + e.data;
                mylist.appendChild(newElement);

                console.log('onmessage=', e);

            }, false);

        }

        function stopListening() {

            // stop listening to the server message.
            if (sse) {

                // this proc inform the browser to stop reconnecting the server
                // in case of connection failure.
                sse.close();
                sse = null;

                var newElement = document.createElement("li");
                newElement.textContent = "message: stopped by user";
                mylist.appendChild(newElement);
               
            }

        }

    </script>

</body>
</html>

Saturday, April 21, 2018

Posting compressed data in JSON format to the ASP.NET website using WinForm

We posted an article in 2015 on how to post JSON data from WinForm to ASP.NET/ASHX (linked below). That earlier article does not optimize the payload size or reduce the communication time.

    http://laucsharp.blogspot.my/2015/07/posting-data-in-json-format-to-aspnet.html

In order to speed up the communication between the WinForm client and ASP.NET/ASHX, we must reduce the payload size. To do that, the content must be compressed before we upload the data and decompressed upon receiving it.

There are two common compression algorithms for HTTP/HTTPS communication: gzip and deflate.

Here is the code to compress the content with the GZip algorithm. You need this code in both WinForm and ASP.NET/ASHX (it requires using System.IO; and using System.IO.Compression;).

        public static void GZip(Stream stream, string data)
        {
            byte[] b = System.Text.Encoding.UTF8.GetBytes(data);
            GZip(stream, b);
        }

        public static void GZip(Stream stream, byte[] data)
        {
            using (var zipStream = new GZipStream(stream, CompressionMode.Compress, true))
            {
                zipStream.Write(data, 0, data.Length);
            }
        }
To decompress the GZip content, you need this code in both WinForm and ASP.NET/ASHX:

        public static string GUnZipToString(Stream stream)
        {
            byte[] b = GUnZip(stream);
            return System.Text.Encoding.UTF8.GetString(b);
        }

        public static byte[] GUnZip(Stream stream)
        {
            using (var zipStream = new GZipStream(stream, CompressionMode.Decompress, true))
            using (MemoryStream ms = new MemoryStream())
            {
                zipStream.CopyTo(ms);
                return ms.ToArray();
            }
        }
In the WinForm app, we used to write the JSON data directly to the HttpWebRequest like this:

        // write the json data into the request stream.
        using (StreamWriter writer = new StreamWriter(request.GetRequestStream()))
        {
                writer.Write(json_data);
        }

In order to reduce the payload, we should write the compressed data by doing this:

        // to indicate we accept gzip content.
        request.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip");

        // to indicate that the content is compressed using gzip algorithm.
        request.Headers.Add("Content-Encoding", "gzip");

        GZip(request.GetRequestStream(), json_data);

 In the ASP.NET/ASHX, we used to read the contents like this:

        // get the contents from the request stream
        Stream stream = context.Request.InputStream;
        using (StreamReader r = new StreamReader(stream))
        {
            s = r.ReadToEnd();
        }

Now, we have to read the content and then decompress it. "Content-Encoding" indicates whether the content is gzip, deflate, or uncompressed.

        if (HelperFunc.IsGZipContent(context))
        {
                 // read the compressed content.
                s = GUnZipToString(context.Request.InputStream);          
        }
         else
        {
                // get the contents from the request stream
                Stream stream = context.Request.InputStream;
                using (StreamReader r = new StreamReader(stream))
                {
                        s = r.ReadToEnd();
                }
        }

Here is the function to check whether the content is compressed in GZip format. Please note that the sample code in this article does not support the "deflate" algorithm.

        public static bool IsGZipContent(System.Web.HttpContext ctx)
        {
            string enc = ctx.Request.Headers["Content-Encoding"];

            if (string.IsNullOrEmpty(enc))
            {
                return false;
            }

            return  enc.ToLower().Contains("gzip");
        }

You will find more information about static and dynamic compression in IIS here:

   https://docs.microsoft.com/en-us/iis/configuration/system.webserver/httpcompression/

To find out why we need to compress the data, please refer to the following link:

   https://developer.mozilla.org/en-US/docs/Web/HTTP/Compression


Wednesday, October 4, 2017

Cross-Origin Request Blocked (CORS)

To speed up development and future upgrades, we split the huge application into multiple AJAX services. Each AJAX service runs in its own application pool and can run on a different server. The design works perfectly. But when you want to consume the AJAX services through the browser, you bang your head: "Cross-Origin Request Blocked".

This is the error message that appears in Firefox:

Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://localhost/schedule/q?code=tx&ts=1507099862873. (Reason: CORS header ‘Access-Control-Allow-Origin’ does not match ‘(null)’).

Google Chrome returned an error message that is slightly different:

Failed to load http://localhost/schedule/q?code=tx&ts=1507099946004: The 'Access-Control-Allow-Origin' header contains multiple values '*, *', but only one is allowed. Origin 'http://localhost:56269' is therefore not allowed access.

Now, if you google for the solution, you will end up adding the following settings to web.config.

  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="*" />
      <add name="Access-Control-Allow-Headers" value="Content-Type" />
      <add name="Access-Control-Allow-Methods" value="GET, POST, PUT, DELETE, OPTIONS" />
    </customHeaders>
  </httpProtocol>

But the wildcard origin is no longer supported, so you end up adding a specific origin.

  <httpProtocol>
    <customHeaders>
      <add name="Access-Control-Allow-Origin" value="http://localhost:56292" />
      <add name="Access-Control-Allow-Headers" value="Content-Type" />
      <add name="Access-Control-Allow-Methods" value="GET, POST, PUT, DELETE, OPTIONS" />
    </customHeaders>
  </httpProtocol>

Imagine that you are hosting your AJAX services on multiple servers with different sub-domains... the above solution will not work, because you are not allowed to add more than one domain name to "Access-Control-Allow-Origin".

To solve the problem, we need to handle the OPTIONS verb by adding the following settings in the web.config:

  <system.webServer>
    <handlers>
      <add verb="OPTIONS" name="check_opt" path="*" type="ajaxLib.CORS_OPTIONS" />
    </handlers>
  </system.webServer>

And below is the simplified code that allows CORS:

using System.Web;

namespace ajaxLib
{
    public class CORS_OPTIONS : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            if (context.Request.HttpMethod.ToUpper() == "OPTIONS")
            {
                string s = context.Request.Headers["Origin"];

                if (!string.IsNullOrWhiteSpace(s))
                {
                    context.Response.AppendHeader("Access-Control-Allow-Origin", s);
                    context.Response.AppendHeader("Access-Control-Allow-Headers", "Content-Type");
                    context.Response.AppendHeader("Access-Control-Allow-Methods", "GET, POST, OPTIONS");
                }
            }
        }

        public bool IsReusable { get { return false; } }
    }
}

There are two possibilities if you want to use the above code in a live environment:

1. If your service allows public access without any restriction, skip checking the Origin value.
2. If your service only allows specific domains, you must check the Origin value against your allowed list before returning it to the caller (see the sketch below).
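
For case 2, here is a minimal sketch of that origin check; the allowed list is a hypothetical example and the members are meant to be placed inside the handler class.

    // hypothetical whitelist; in practice this could come from web.config or a database.
    // assumes 'using System;' for StringComparison.
    static readonly string[] _allowed_origins = new string[]
    {
        "http://www1.example.com",
        "http://www2.example.com"
    };

    static bool IsAllowedOrigin(string origin)
    {
        if (string.IsNullOrWhiteSpace(origin))
        {
            return false;
        }

        foreach (string allowed in _allowed_origins)
        {
            if (string.Equals(allowed, origin, StringComparison.OrdinalIgnoreCase))
            {
                return true;
            }
        }
        return false;
    }

    // inside ProcessRequest:
    //   if (IsAllowedOrigin(s)) { context.Response.AppendHeader("Access-Control-Allow-Origin", s); ... }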



Saturday, July 22, 2017

Decoupling the code with "notification" design

In C#, there are many ways to decouple a program, and the most common are the following:
  • event - it's multicasting; you listen to what you need.
  • interface - it's dynamic; you can incorporate the interface into any of your classes.
  • callback using Action<T> or Func<T> - you handle only what you are interested in.
Other than the above, there is another way to decouple the code: using "notification messages". For example, the lowest-level WndProc (in WinForm) processes all Windows messages. Let's make use of this strategy in our C# program.

The core of this strategy is a publisher & subscriber + multi-threaded object. Let's call it NotificationServer. This core object allows publishers to send notification messages into the message pool. It also allows subscribers to subscribe and listen to the notification messages that they are interested in. You will find tons of examples of how to implement publisher & subscriber + multi-threading on the Internet.


The notification message object contains the following properties:
  • message ID - the ID that allows the subscriber to identify its purpose.
  • session ID - a GUID, used in conjunction with the broadcast flag.
  • data - the object being passed around.
  • broadcast - a flag that tells the NotificationServer whether it should send the message to a specific subscriber or to all subscribers. This can be very useful if you are implementing a TCP NotificationServer.
  • should feedback - a flag that indicates whether the publisher waits for the NotificationServer's response. This can be very useful if you are implementing a TCP NotificationServer.
The NotificationServer design
  • Embedded NotificationServer - the implementation is a publisher & subscriber + multi-threaded object.
  • The fun does not stop here - you can embed the NotificationServer class into a TCP server class so that all communication is done over TCP. In this case, you will have a TCP NotificationServer which can run as a Windows Service. The publishers and subscribers can be any program, an ASP.NET web page or another Windows Service, running on the same computer or on different computers.
In our TCP NotificationServer implementation, the notification message is serialized into JSON format. We chose JSON because we reuse the TCP NotificationServer in various projects.
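
Below is a minimal, single-process sketch of the embedded NotificationServer idea, assuming a BlockingCollection-based message pool and simple delegate subscribers; the class and member names are my own and this is not the TCP implementation described above.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

public class NotificationMessage
{
    public string MessageId { get; set; }
    public Guid SessionId { get; set; }
    public object Data { get; set; }
    public bool Broadcast { get; set; }
}

public class NotificationServer
{
    // the message pool fed by the publishers.
    private readonly BlockingCollection<NotificationMessage> _pool =
        new BlockingCollection<NotificationMessage>();

    // subscribers keyed by message ID.
    private readonly ConcurrentDictionary<string, List<Action<NotificationMessage>>> _subscribers =
        new ConcurrentDictionary<string, List<Action<NotificationMessage>>>();

    public NotificationServer()
    {
        // one worker thread dispatching messages to the subscribers.
        Task.Factory.StartNew(Dispatch, TaskCreationOptions.LongRunning);
    }

    public void Publish(NotificationMessage msg)
    {
        _pool.Add(msg);
    }

    public void Subscribe(string messageId, Action<NotificationMessage> handler)
    {
        var list = _subscribers.GetOrAdd(messageId, _ => new List<Action<NotificationMessage>>());
        lock (list) { list.Add(handler); }
    }

    private void Dispatch()
    {
        foreach (var msg in _pool.GetConsumingEnumerable())
        {
            List<Action<NotificationMessage>> list;
            if (_subscribers.TryGetValue(msg.MessageId, out list))
            {
                lock (list)
                {
                    foreach (var handler in list) { handler(msg); }
                }
            }
        }
    }
}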

Tuesday, October 4, 2016

Using partial HTML UI + ASHX to speed up page loading

Just to share with everyone: I have developed many LOB apps (line-of-business applications) and I'm still developing new ones for my clients. LOB is very different from blog engines, corporate websites and static websites. In LOB, we can happily ignore making the content "readable" for SEO because some of the content is loaded by AJAX.

As per my last blog post dated 26th June 2016, I mentioned the new strategy: jQuery + AJAX + partial HTML UI. Now, the question is: how does partial HTML UI help speed up page loading? The answer is simple - we rely on the browser cache by checking the "If-Modified-Since" header in the request, and then respond with either status code 304 (resource has not been modified) or the partial HTML (a sketch follows the reference below).

You will find tons of references if you search for "asp.net If-Modified-Since". Below is one of the references that I found:

  http://madskristensen.net/post/use-if-modified-since-header-in-aspnet
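
Here is a rough sketch of the idea in an ASHX handler; the handler name, template path and the one-second tolerance are assumptions for illustration only.

<%@ WebHandler Language="C#" Class="partial_html" %>

using System;
using System.IO;
using System.Web;

public class partial_html : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // hypothetical partial HTML file requested by the client.
        string path = context.Server.MapPath("~/templates/sales_invoice.html");
        DateTime lastModified = File.GetLastWriteTimeUtc(path);

        // check the If-Modified-Since header sent by the browser.
        string since = context.Request.Headers["If-Modified-Since"];
        DateTime sinceDate;
        if (!string.IsNullOrEmpty(since)
            && DateTime.TryParse(since, out sinceDate)
            && lastModified <= sinceDate.ToUniversalTime().AddSeconds(1))
        {
            // resource has not been modified - let the browser use its cache.
            context.Response.StatusCode = 304;
            context.Response.SuppressContent = true;
            return;
        }

        // otherwise, return the partial HTML with a Last-Modified header.
        context.Response.ContentType = "text/html";
        context.Response.Cache.SetLastModified(lastModified.ToLocalTime());
        context.Response.WriteFile(path);
    }

    public bool IsReusable { get { return false; } }
}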

The downside of this strategy is that the first request to load the full page feels a bit slower. But subsequent page loads, or requests for the same page, take less time. For example, we want to develop a page for the user to key in a sales invoice, and it allows the user to choose items from a list. The sales invoice is stored in an HTML file (not ASPX) and the item-list HTML design is stored in another HTML file (where the item list will be reused by the supplier invoice).

One of the advantages of this strategy is that it allows all these partial HTML files to be hosted on a CDN (Content Delivery Network). The whole LOB app then loads faster than using only one ASPX, which could be crazily huge and makes it hard to reuse parts of the HTML design.

Note: "partial HTML UI" can be refer as "template" and it does not contains the HEAD and BODY tags. It just contains some DIV-s which eases the web designer to design and test in the browser. You don't need a programmer to start full coding but just some simple JQuery and AJAX to complete the demo.


Sunday, June 26, 2016

System design with ASHX web services (JQuery + AJAX + partial HTML UI)

Recently, we have been working on a few new systems with ASP.NET. In the past, we used Page (ASPX) + ScriptManager and we faced some limitations in the system design, including the following:
  • The system does not allow the user to add a new "item" to the drop-down list on the fly.
  • The drop-down list contains a few hundred items and we need to incorporate 'search' functionality or paging to avoid loading all items in one shot.
  • The data submitted to the server fails the validation process, causing the entire page to be sent back to the client (a round trip from client to server and back to the client).
To solve these types of problems, we need a new design for the foundation of the system.
  • We must use a jQuery + AJAX + partial HTML UI design so that the user can add an "item" to the drop-down list on the fly. The partial HTML UI will appear on the screen as a popup for the user to add the new item. After the user has submitted the new item to the server and it has been validated, it will be added to the drop-down list on the fly (with jQuery) without reloading the page or navigating to another page.
  • The drop-down list that contains lots of items will be replaced by a textbox. When the user clicks on this textbox, the items will be loaded from the server (with AJAX calls) and then displayed in a popup (with jQuery + partial HTML UI). To improve the user experience, you may consider an auto-complete feature as the user types, or divide the data using the paging concept.
  • We must make more AJAX calls for submitting the user input to be validated by the system. In case of any failure, such as failed validation, the server should return the error message only. This avoids re-creating the entire page on the server and sending it back to the browser.
With the new design, the system becomes more responsive and generates less network traffic. But we still have a problem: how to handle the AJAX calls. Are we going to have one ASHX per process (which will end up with lots of ASHX files)? Or are we going to have only one ASHX entry point that handles all the requests?

To solve this problem, here is the list of frequently used "web services" to be implemented with Handlers (ASHX):
  • ~/q - this web service handles the "query", which includes the CRUD (create, read, update & delete) processes, daily processes, ad-hoc processes and all other business processes. Report requests are another area which you may consider putting into this service.
  • ~/t - this web service returns the "HTML template" (partial HTML UI design) to be injected into the current web page. By creating the partial HTML UI file, it allows the designer to work on the layout without having to go through all the jQuery + DOM element generation (i.e., low-level stuff). Modifying the DOM elements using jQuery is very time consuming and it requires a more expensive JavaScript programmer. But we have done it at a lower cost: the nice partial HTML UI is done by the designer and the programmer only needs to populate the JSON data into the appropriate placeholders.
  • ~/f - this web service handles all the file services, including upload and download/view. For example, when the user calls out the "contact list", it shows the profile photo of each contact. This profile photo IMG SRC is "~/f?contact_id=124567" where "contact_id" is the primary key value of the contact. It does not point to any physical file name. The "f" service does all the necessary work at the server side and returns the binary of the photo (an image file).
To set up the above shortcuts, you have to create mapped URLs in web.config. For example,

  <system.web>
    <urlMappings>
      <add url="~/q" mappedUrl="~/myWebService/q.ashx"/>
    </urlMappings>
  </system.web>

The design of these web services (a dispatcher sketch follows this section):
  • The client makes a request to the web service with:
    • "code" - the command code, object type or process code to be executed.
    • "action" - this includes CRUD and other actions (such as run daily job, run hourly job).
    • "query parameters" - the query parameters are wrapped in a JSON object. For example, the client requests the customers who owe more than $10,000 for more than 90 days.
  • The response to the client contains:
    • "msg" - the message to the client. "ok" indicates the query executed successfully; otherwise, it contains the error message.
    • "list" - the list of JSON-formatted data requested by the client. This information is optional.
Some of you might be wondering why we are not using REST for these web services. The answer is simple: we don't want to tie every operation to an HTTP "method" such as PUT or POST. What if the user wants to execute an ad-hoc process (POST or a custom method)?
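
As an illustration of this request/response shape, the ~/q entry point might dispatch roughly like this; the class names, codes and actions below are assumptions, not the actual project code.

<%@ WebHandler Language="C#" Class="q" %>

using System;
using System.Collections.Generic;
using System.IO;
using System.Web;
using System.Web.Script.Serialization;

public class q : IHttpHandler
{
    // hypothetical request/response envelopes.
    public class QueryRequest
    {
        public string code { get; set; }      // command code / object type
        public string action { get; set; }    // create, read, update, delete, ...
        public Dictionary<string, object> query_parameters { get; set; }
    }

    public class QueryResponse
    {
        public string msg { get; set; }         // "ok" or an error message
        public List<object> list { get; set; }  // optional data for the client
    }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "application/json";

        string body;
        using (StreamReader r = new StreamReader(context.Request.InputStream))
        {
            body = r.ReadToEnd();
        }

        JavaScriptSerializer js = new JavaScriptSerializer();
        QueryRequest req = js.Deserialize<QueryRequest>(body);
        QueryResponse resp = new QueryResponse();

        try
        {
            // dispatch by "code" + "action"; each branch calls the real business process.
            switch (req.code + ":" + req.action)
            {
                case "customer:read":
                    resp.list = new List<object>();   // populate from the database here
                    resp.msg = "ok";
                    break;
                default:
                    resp.msg = "unknown code/action";
                    break;
            }
        }
        catch (Exception ex)
        {
            resp.msg = ex.Message;
        }

        context.Response.Write(js.Serialize(resp));
    }

    public bool IsReusable { get { return false; } }
}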



Thursday, January 14, 2016

WebSocket

Implementing WebSocket in ASP.NET is quite easy. You need two components:

1. The ASHX that handles the WebSocket communication.
2. The client-side JavaScript which sends messages to the server and waits for server messages.

The WSHandler.ashx page resides in the "WSChat" folder:

<%@ WebHandler Language="C#" Class="WSHandler" %>
using System;
using System.Web;
using System.Threading;
using System.Threading.Tasks;
using System.Web.WebSockets;
using System.Net.WebSockets;
using System.Text;
using System.Collections.Generic;
using System.Linq;

//22.Dec.15,lhw-
public class WSHandler : IHttpHandler
{

    public void ProcessRequest(HttpContext context)
    {
        if (context.IsWebSocketRequest)
        {
            context.AcceptWebSocketRequest(ProcessWSChat);
        }
    }

    public bool IsReusable { get { return false; } }


    private async Task ProcessWSChat(AspNetWebSocketContext context)
    {
        WebSocket socket = context.WebSocket;

        //<<=======
        MyConnection cn = new MyConnection(socket);
        _conn_list.Add(cn);
        //<<=======

        while (true)
        {
            ArraySegment<byte> buffer = new ArraySegment<byte>(new byte[1024]);

            WebSocketReceiveResult result = await socket.ReceiveAsync(buffer, CancellationToken.None);

            //------------------------------------------------------------------------------
            if (socket.State == WebSocketState.Open)
            {
                string userMessage = Encoding.UTF8.GetString(buffer.Array, 0, result.Count);


                if (userMessage.StartsWith("helo from"))
                {
                    // user has signed in with his ID.
                    cn.uid = userMessage.Substring("helo from".Length, userMessage.Length - "helo from".Length).TrimStart();
                }
                else if (userMessage.ToLower().StartsWith("b:"))
                {
                    // broadcast the message.
                    userMessage = "You sent: " + userMessage + " at " + DateTime.Now.ToLongTimeString();
                    await Broadcast(userMessage);
                }
                else
                {
                    // echo the message
                    userMessage = "You sent: " + userMessage + " at " + DateTime.Now.ToLongTimeString();
                    buffer = new ArraySegment<byte>(Encoding.UTF8.GetBytes(userMessage));
                    await socket.SendAsync(buffer, WebSocketMessageType.Text, true, CancellationToken.None);
                }
            }
            //------------------------------------------------------------------------------
            else if (socket.State == WebSocketState.CloseReceived)
            {
                // remove the current connection from memory
                var v = _conn_list.Where(n => n.sess_id == cn.sess_id).FirstOrDefault();
                if (v != null)
                {
                    _conn_list.Remove(v);
                }

                // inform everyone that the current user has left the chat.
                await Broadcast(cn.uid + " has signed out");

                // stop receiving on a closing socket.
                break;
            }
            else
            {
                break;
            }
        }
    }

    //------------------------------------------------------------------------------
    async Task Broadcast(string msg)
    {
        ArraySegment<byte> buffer = new ArraySegment<byte>(Encoding.UTF8.GetBytes(msg));

        foreach (var item in _conn_list)
        {
            await item.socket.SendAsync(buffer, WebSocketMessageType.Text, true, CancellationToken.None);
        }
    }

    //------------------------------------------------------------------------------
    public class MyConnection
    {

        public string sess_id { get; private set; }
        public DateTime connected_on { get; private set; }
        public WebSocket socket { get; set; }

        // value from the browser. The user must send 'helo from xxx' where 'xxx' is the user id.
        public string uid { get; set; }

        public List<string> chat_room_list { get; set; }

        public MyConnection(WebSocket sk)
        {
            this.sess_id = Guid.NewGuid().ToString();
            this.connected_on = DateTime.Now;
            this.chat_room_list = new List<string>();
            this.socket = sk;
        }
    }

    static List<MyConnection> _conn_list = new List<MyConnection>();

}

The client side JavaScript

<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title>WebSocket Chat</title>
    <script type="text/javascript" src="js/jquery-1.7.js"></script>
    <script type="text/javascript">
        var ws;
        $().ready(function () {
            $("#btnConnect").click(function () {
                $("#spanStatus").text("connecting");
                ws = new WebSocket("ws://" + window.location.hostname +
                    ":64495/WSChat/WSHandler.ashx");
                ws.onopen = function () {
                    $("#spanStatus").text("connected");
                    post_msg();
                };
                ws.onmessage = function (evt) {
                    $("#spanStatus").text(evt.data);
                };
                ws.onerror = function (evt) {
                    $("#spanStatus").text(evt.message);
                };
                ws.onclose = function () {
                    $("#spanStatus").text("disconnected");
                };
            });
            $("#btnSend").click(function () {
                if (ws.readyState == WebSocket.OPEN) {
                    ws.send($("#textInput").val());
                }
                else {
                    $("#spanStatus").text("Connection is closed");
                }
            });

            $("#btnDisconnect").click(function () {
                ws.close();
            });

            var _timer = null;
            function post_msg() {
                _timer = window.setInterval(function () {
                    send_msg('helo from lau - ' + (new Date()).getSeconds().toString());
                }, 300);
            }
            function send_msg(s) {
                if (_timer != null) {
                    clearInterval(_timer);
                }

                if (ws.readyState == WebSocket.OPEN) {
                    ws.send(s);
                }
                else {
                    $("#spanStatus").text("Connection is closed");
                }
            }
        });
    </script>
</head>
<body>
    <input type="button" value="Connect" id="btnConnect" />
    <input type="button" value="Disconnect" id="btnDisconnect" /><br />
    <input type="text" id="textInput" />
    <input type="button" value="Send" id="btnSend" /><br />
    <span id="spanStatus">(display)</span>
</body>
</html>

Saturday, August 1, 2015

Failed to deserialize the object due to DLL version/name has changed

When you try to deserialize the binary back to an object, you may encounter the following exception:

  BinaryFormatter.Deserialize “unable to find assembly”

Basically, it tells you that it cannot find the DLL by version + name. This is a common issue when you change the DLL version number or move the class to another project/assembly. As a result, we need a way to tell the BinaryFormatter class the correct new "type" for the binary.

In the deserialization process, you need to add a line to use your custom binder:

using (MemoryStream memory = new MemoryStream(user_input))
{
    BinaryFormatter binary = new BinaryFormatter();

    //fix the deserialization error when the DLL version has been changed.
    binary.Binder = new PreMergeToMergedDeserializationBinder();

    // convert the binary to the list.
    this._data = binary.Deserialize(memory) as List<CUserDataItem>;
}

And then add the following class. I have enhanced this class so it is able to handle generic lists as well. It requires using System.Reflection; ExtractString is my string extension shown further below, and IsNotEmpty is just a !string.IsNullOrEmpty helper.

public sealed class PreMergeToMergedDeserializationBinder : System.Runtime.Serialization.SerializationBinder
{
    public override Type BindToType(string assemblyName, string typeName)
    {
        Type typeToDeserialize = null;

        // For each assemblyName/typeName that you want to deserialize to
        // a different type, set typeToDeserialize to the desired type.
        String exeAssembly = Assembly.GetExecutingAssembly().FullName;
      
        // The following line of code returns the type.

        // extract the 'old dll name/version'.
        string old_dll = typeName.ExtractString(',', ']');

        if (old_dll.IsNotEmpty())
        {
            // for generic list, we replace the dll name/version here.
            typeToDeserialize = Type.GetType(typeName.Replace(old_dll, exeAssembly.ToString()));
        }
        else
        {
            // for 1 single object, the 'typeName' is the class name.
            // We should return the type name with the new dll name/version.
            typeToDeserialize = Type.GetType(String.Format("{0}, {1}",
                                                typeName, exeAssembly));
        }

        System.Diagnostics.Debug.Assert(typeToDeserialize != null);

        return typeToDeserialize;
    }
}

I have a string extension method which helps to extract a partial string:

public static string ExtractString(this string s,
    char start_char,
    char end_char)
{
    int i = s.IndexOf(start_char);
    if (i >= 0)
    {
        //16.Nov.2011-lhw-the 'end_char' should be search after the 'start_char'.
        int i2 = s.IndexOf(end_char,
                           i + 1);      //16.Nov.2011-lhw-missing the start pos!!

        string tmp = s.Substring(i + 1,
                                 i2 - i - 1);

        return tmp;
    }
    else
    {
        return string.Empty;
    }
}

Reference:
http://stackoverflow.com/questions/5170333/binaryformatter-deserialize-unable-to-find-assembly-after-ilmerge

Thursday, July 2, 2015

Routing to ASHX


Here is a piece of code that I found on CodeProject.com. By adding this extension method, you will be able to route requests to an ASHX:

namespace System.Web.Routing
{
    public class HttpHandlerRoute : IRouteHandler
    {
        private String _virtualPath = null;
        private IHttpHandler _handler = null;

        public HttpHandlerRoute(String virtualPath)
        {
            _virtualPath = virtualPath;
        }

        public HttpHandlerRoute(IHttpHandler handler)
        {
            _handler = handler;
        }

        public IHttpHandler GetHttpHandler(RequestContext requestContext)
        {
            IHttpHandler result;
            if (_handler == null)
            {
                result = (IHttpHandler)System.Web.Compilation.BuildManager.CreateInstanceFromVirtualPath(_virtualPath, typeof(IHttpHandler));
            }
            else
            {
                result = _handler;
            }
            return result;
        }
    }

    public static class RoutingExtensions
    {
        public static void MapHttpHandlerRoute(this RouteCollection routes, string routeName, string routeUrl, string physicalFile, RouteValueDictionary defaults = null, RouteValueDictionary constraints = null)
        {
            var route = new Route(routeUrl, defaults, constraints, new HttpHandlerRoute(physicalFile));
            RouteTable.Routes.Add(routeName, route);
        }

        public static void MapHttpHandlerRoute(this RouteCollection routes, string routeName, string routeUrl, IHttpHandler handler, RouteValueDictionary defaults = null, RouteValueDictionary constraints = null)
        {
            var route = new Route(routeUrl, defaults, constraints, new HttpHandlerRoute(handler));
            RouteTable.Routes.Add(routeName, route);
        }
    }
}
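
A usage sketch - the route name, URL pattern and ASHX path below are just examples - registering the route in Global.asax:

using System;
using System.Web.Routing;

void Application_Start(object sender, EventArgs e)
{
    // route "/product/123" to the existing handler; "id" becomes route data.
    RouteTable.Routes.MapHttpHandlerRoute("product_route",
                                          "product/{id}",
                                          "~/handlers/product.ashx");
}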

To access the routing data in ASHX, you need to do this:


            var o = context.Request.RequestContext.RouteData.Values["id"];
            if (o != null)
            {
                q = o.ToString();
            }

Reference:
http://www.codeproject.com/Tips/272258/ASP-net-HttpHandler-Routing-Support

Posting data in JSON format to the ASP.NET website using WinForm

Previously, we have shown how to post JSON data using JQuery to ASP.NET.

  http://laucsharp.blogspot.com/2013/03/posting-data-in-json-format-to-aspnet.html

Now, we are going to post JSON data using WinForm:

This is our business object which will reside at the server and client.

    public class Class1
    {
        public string code { get; set; }
        public string name { get; set; }

        public override string ToString()
        {
            return string.Format("code={0}, name={1}",
                this.code,
                this.name);
        }
    }

In the WinForm client program, when the user hits Button1 after keying in the client code and name, the data will be submitted to the server:

        private void button1_Click(object sender, EventArgs e)
        {
            // store the user input into the business object.
            Class1 data = new Class1();
            data.code = this.client_code.Text;
            data.name = this.client_name.Text;

            // convert it into json format.
            JavaScriptSerializer js = new JavaScriptSerializer();
            string json_data = js.Serialize(data);

            // create the web request.
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create("http://localhost:57655/dataGateway.ashx");
            request.ContentType = "application/json";
            request.Method = "POST";

            // write the json data into the request stream.
            using (StreamWriter writer = new StreamWriter(request.GetRequestStream()))
            {
                writer.Write(json_data);
            }

            // get the server response.
            using (WebResponse response = request.GetResponse())
            {
                // read the server response.
                Stream response_stream = response.GetResponseStream();
                using (StreamReader r = new StreamReader(response_stream))
                {
                    // do what ever you want with the response.
                    this.label5.Text = r.ReadToEnd();
                }
            }           
        }

Finally, at the server side, we add a Generic Handler (dataGateway.ashx) and it looks like this:

<%@ WebHandler Language="C#" Class="dataGateway" %>

using System;
using System.Web;
using System.IO;
using System.Web.Script.Serialization;

public class dataGateway : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        string s;
       
        // get the contents from the request stream
        Stream stream = context.Request.InputStream;
        using (StreamReader r = new StreamReader(stream))
        {
            s = r.ReadToEnd();
        }

        // ensure that the content is not empty.
        if (string.IsNullOrEmpty(s) || s.Length == 0)
        {
            context.Response.Write("'data' cannot be blank");
            return;
        }

        // convert it from json format to our business object
        JavaScriptSerializer js = new JavaScriptSerializer();
        Class1 obj = js.Deserialize<Class1>(s);

        // do whatever you want
        context.Cache["data"] = obj;

        // returns the response code/status to the caller.
        context.Response.Write("ok. received the data =>" + s);
    }

    public bool IsReusable { get { return false; } }
}

Next, sending compressed data in WinForm:

    http://laucsharp.blogspot.my/2018/04/posting-compressed-data-in-json-format.html

Thursday, June 18, 2015

Excluding folders upon publishing the website

In VS2013, after you have set up the "publish" (right-click on the website and choose Publish Web Site), a new configuration file (website.publishproj) will be added to the project.

To exclude folders, open this file and add the following section within the "Project" element:

  <ItemGroup>
    <ExcludeFromPackageFolders Include="log;temp;">
      <FromTarget>Remove temp folders</FromTarget>
    </ExcludeFromPackageFolders>
  </ItemGroup>

In the "Include" attribute, it contains the folder to be removed when publishing the website. The above example excluding "log" and "temp" folders.


Sunday, June 14, 2015

Asp.net WebForm + Routing

To add routing support in WebForms, you need to add a "System.Web.Routing" reference to your website.

Then, in the global.asax file, add the following code:

    void Application_Start(object sender, EventArgs e)
    {
        // Code that runs on application startup
        RegisterRoute(System.Web.Routing.RouteTable.Routes);
    }
   
    void RegisterRoute(System.Web.Routing.RouteCollection r)
    {
        // shows all customer
        r.MapPageRoute("allcust",
                        "customer",
                        "~/customer.aspx");
       
        // shows 1 customer profile
        r.MapPageRoute("cust",
                        "customer/{code}",
                        "~/customer.aspx");

        // shows the doc for the customer
        r.MapPageRoute("cust_doc",
                        "customer/{code}/{doc_no}",
                        "~/customer.aspx");

        // this allows any sub-level of parameters; the page needs to parse 'Page.RouteData.Values["queryvalues"]' (a single value).
        // The caller can call this route by the following url:
        //      /cust
        //      /cust/a001
        //      /cust/a001/inv123456
        //
        r.MapPageRoute("cust_query",
                        "cust/{*queryvalues}",
                        "~/customer.aspx");
       
    }

Finally, add the ASPX pages that you need to display the information. In our example, we have only one page, customer.aspx. You can get the parameter values from "Page.RouteData". Note: the last route set up above is not shown in the following code.

    protected void Page_Load(object sender, EventArgs e)
    {
        if (this.RouteData.Values.Count == 0)
        {
            this.lbl1.Text = "cust list";
        }
        else
        {
            if (this.RouteData.Values.Count == 1)
            {
                this.lbl1.Text = "customer => " + this.RouteData.Values["code"];
            }
            else
            {
                this.lbl1.Text = "customer => " + this.RouteData.Values["code"]
                    +",doc_no=" + this.RouteData.Values["doc_no"];
            }

            foreach (var item in this.RouteData.Values.Keys)
            {
                System.Diagnostics.Debug.WriteLine("param=" + item
                     + ",value=" + this.RouteData.Values[item]);
            }
        }
    }

If you need to set the navigate URL in the hyperlink control, this can be done easily as shown below:

    protected void Page_Load(object sender, EventArgs e)
    {
        // manually setup the URL based on the parameters.
        this.all_cust_link.NavigateUrl = RouteTable.Routes.GetVirtualPath(null, "allcust", null).VirtualPath;

        RouteValueDictionary d;
        d = new RouteValueDictionary();
        d.Add("code", "a001");

        this.one_cust_link.NavigateUrl = RouteTable.Routes.GetVirtualPath(null, "cust", d).VirtualPath;

        d = new RouteValueDictionary();
        d.Add("code", "a001");
        d.Add("doc_no", "inv123456");
        this.one_doc_link.NavigateUrl = RouteTable.Routes.GetVirtualPath(null, "cust_doc", d).VirtualPath;

    }


Thursday, April 23, 2015

ASP.NET UpdatePanel is not working in Google Chrome

I received a complaint from one of my users who said that the data grid was not refreshing properly in Google Chrome.

Guess what..? I turned on the developer console in Chrome and found out that there was an error loading some JavaScript. After some research, I found the script below, which fixed the problem:

Sys.Browser.WebKit = {};
if (navigator.userAgent.indexOf('WebKit/') > -1) {
            Sys.Browser.agent = Sys.Browser.WebKit;
            Sys.Browser.version = parseFloat(navigator.userAgent.match(/WebKit\/(\d+(\.\d+)?)/)[1]);
            Sys.Browser.name = 'WebKit';
}

Friday, October 3, 2014

Queue processing design

The queue concept is straightforward - append new items at the tail and dequeue from the head. It does not tell you how to process the requests in the queue.

You can enqueue new requests from multiple threads and process the requests with one worker thread or multiple worker threads. That leads to the following processing designs:

Sequential (synchronous) processing the requests

With only one worker thread processing the requests in the queue, this is going to be slower than multiple worker threads, but it has advantages. The advantage of sequential processing shows up in file I/O and network I/O operations. Just imagine that the exit has one door and only one person can pass through at a time.

Parallel (asynchronous) processing the requests

Thanks to multi-core and multi-CPU, with more than one worker thread processing the requests, we reduce the client wait time (for the response) and increase the server throughput.
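
As a rough sketch of both designs (assuming a BlockingCollection as the queue; the RequestProcessor name is just for illustration), the only difference between sequential and parallel processing is the number of worker threads:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class RequestProcessor
{
    private readonly BlockingCollection<string> _queue = new BlockingCollection<string>();

    // workerCount = 1 gives sequential processing; > 1 gives parallel processing.
    public RequestProcessor(int workerCount)
    {
        for (int i = 0; i < workerCount; i++)
        {
            Task.Factory.StartNew(Work, TaskCreationOptions.LongRunning);
        }
    }

    // producers (any thread) enqueue the requests here.
    public void Enqueue(string request)
    {
        _queue.Add(request);
    }

    private void Work()
    {
        foreach (string request in _queue.GetConsumingEnumerable())
        {
            // process the request; the processing time depends on this code,
            // not on the number of threads.
            Console.WriteLine("processing " + request);
        }
    }
}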

Multi-threading has nothing to do with "reducing the processing time of a request"! The processing time depends on your code and how much that code can be optimized (at development time and also at run time).

If there is a long-running request being handled by a worker thread (in a thread pool), that worker thread is blocked. It won't be able to handle other requests until it finishes the current one. And if you have many long-running requests, all the threads in the thread pool might not be able to reduce the response time. The server is now considered "busy" even though clients can keep sending their requests into the queue.

Can we have unlimited threads in the pool..?!

The answer is no. If you instantiate too many threads without using a thread pool, the memory will be depleted... and the program will crash.

What happens to the requests in the queue if my program crashes?

All requests will disappear from memory. Your program may be designed to auto-restart in case it crashes, but you won't be able to recover the requests that were stored in memory.

In case you need something better than storing the requests in memory..

1. Each thread in the thread pool must maintain individual processing statistics, and these statistics will be updated to the database hourly. With this information, the system administrator will be able to tell how many worker threads were blocked, in which hour and by which request. Then, he will be able to justify whether to upgrade or replace the server.

The statistics to be kept by each thread in the thread pool should include (a minimal class sketch appears after point 2 below):

- Number of requests that has been processed
- Total processed time (ms)
- Highest processed time (ms)
- Request name for highest processed time

- Peak hour - one of the 24 hourly slots in a day (just capturing the current hour value will do)

2. Use a database as the queue storage instead of memory - the main advantage is that the server will be able to continue from where it left off before crashing. With this approach, there will be some overhead in storing the requests in the database, but that overhead is justified by the crash-proof design.

Another advantage of storing the requests in a database is that a secondary server can kick in if all the worker threads in the thread pool (on the primary server) are blocked. The secondary server will be notified by the primary server (through WCF or a socket). Then, the secondary server will query the database for the pending requests and process them accordingly.
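
Here is a minimal sketch of the per-thread statistics described in point 1; the class and member names are assumptions.

using System;

public class WorkerStatistics
{
    public int RequestsProcessed { get; private set; }
    public long TotalProcessedMs { get; private set; }
    public long HighestProcessedMs { get; private set; }
    public string HighestRequestName { get; private set; }
    public int PeakHour { get; private set; }   // 0-23, the hour with the most requests

    private readonly int[] _requestsPerHour = new int[24];

    // the worker thread calls this after completing each request.
    public void Record(string requestName, long elapsedMs)
    {
        RequestsProcessed++;
        TotalProcessedMs += elapsedMs;

        if (elapsedMs > HighestProcessedMs)
        {
            HighestProcessedMs = elapsedMs;
            HighestRequestName = requestName;
        }

        int hour = DateTime.Now.Hour;
        _requestsPerHour[hour]++;
        if (_requestsPerHour[hour] > _requestsPerHour[PeakHour])
        {
            PeakHour = hour;
        }
    }
}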

Two ways to handle the response to the client:

- The result is stored in the database and the primary server is notified. Then, the primary server queries the result and responds to the client.

- OR the result is sent to the primary server (through WCF or a socket), which then responds to the client.


Saturday, August 2, 2014

Queuing the data/request

Non-blocking... hey just call me whenever the result is ready.. I don't want to wait here.

Whenever we talk about blocking (synchronous) or non-blocking (asynchronous), it is obviously related to multi-threading and queues.

A thread is created to execute some code without blocking the main thread (using a time slice of a CPU, or running the thread on a separate CPU core). This improves the responsiveness of the program (i.e., the main thread).

A server program needs to serve many clients concurrently. So, the server program has to queue the requests and let the client go (i.e., without asking it to wait for the result). A queue is always first in, first out (FIFO).
  • The worker thread will always process the request that came first. The result will be sent to the client through a callback (please refer to the Push Design article earlier).
  • On the other hand, a new client request will be appended at the end of the queue, and the client will wait for the result through the callback.
With this processing order, all requests are served based on their "request time".

Things become more complicated with the following designs:
  • Thread pool (i.e., multiple worker threads handle the requests) - you can find many open-source C# thread pool libraries which smartly create more threads when there are many requests and reduce the number of threads when the number of requests drops. Some will even create threads based on the number of CPU cores.
  • Prioritized requests - with prioritization, an urgent request is allowed to jump the queue even though it came in late. You can imagine that the server program has multiple queues (one per priority) and the highest priority has threads on standby to serve the urgent requests.
OK. Let's continue the previous Push Design topic.

You need "command queue" and "callback data queue"

With WCF, socket programming becomes easier. But without a "non-blocking" design in mind, the communication process will make the server or the client unresponsive. The unresponsiveness becomes severe when the number of concurrent clients increases and the amount of data to be transmitted becomes larger. To alleviate this problem, you need to implement a thread pool and queues in both the server program and the client program.

In the client program, you need this:
  • Command queue - when the user clicks "submit request to the server", the command (written in WCF) should go into a "command queue" (this queue resides on the client side). Then, one worker thread will send this request to the server. Since we are designing a "non-blocking" program, the worker thread should not wait for a response from the server. It will continue to send the next request/command until the queue is empty. From the user's point of view, clicking the submit button does not freeze the screen (this is something good). For example, the user is using an Internet browser to open multiple tabs and each tab is requesting a different web page.
  • Callback data queue - once the server has completed the request and sends the result back to the client (through a callback), the client should store the result in a queue. This is the second queue that you need aside from the command queue. Upon receiving the response, a worker thread should dispatch the result to the respective "caller" (which could be a screen) until the queue is empty.

In the server program, you need this:
  • Command queue - when the client program sends a request, it is appended to the queue (this queue resides on the server side). The client should not wait for the result, or else it could block the server (this could end up in a resource contention problem where multiple clients compete for the same resource). A worker thread will pick up the request and do all the necessary processing. Upon completion, it appends the result to the callback data queue and lets another worker thread dispatch the result to the client.
  • Callback data queue - same as in the client program, this is another queue aside from the command queue. The purpose of this queue is to let the command worker thread process the rest of the requests immediately after one request has been completed. Making a callback to the client might face latency problems (i.e., not really "real time" because of unexpected network traffic out there). With a thread that only handles callbacks, even a slow network connection won't affect the command worker threads (the processing time is maintained). The callback worker thread can take its own sweet time to send the result to the client. No worries about the processing time or the limited number of command worker threads in the pool.
The command queue and callback data queue should work in conjunction with a thread pool. You may have one thread pool per queue or one thread pool that takes care of all the queues (a skeletal sketch follows).
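
A skeletal client-side arrangement, assuming a BlockingCollection for each queue; the class name and the string payloads are placeholders for the real WCF commands and results.

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

public class ClientQueues
{
    // outgoing requests waiting to be sent to the server.
    private readonly BlockingCollection<string> _commandQueue = new BlockingCollection<string>();

    // results received from the server through callback, waiting to be dispatched to the UI.
    private readonly BlockingCollection<string> _callbackQueue = new BlockingCollection<string>();

    public ClientQueues()
    {
        Task.Factory.StartNew(SendLoop, TaskCreationOptions.LongRunning);
        Task.Factory.StartNew(DispatchLoop, TaskCreationOptions.LongRunning);
    }

    // called by the UI when the user clicks "submit"; returns immediately.
    public void Submit(string command) { _commandQueue.Add(command); }

    // called by the WCF callback when the server pushes a result.
    public void OnServerCallback(string result) { _callbackQueue.Add(result); }

    private void SendLoop()
    {
        foreach (string command in _commandQueue.GetConsumingEnumerable())
        {
            // send to the server here (WCF call); do not wait for the result.
            Console.WriteLine("sending " + command);
        }
    }

    private void DispatchLoop()
    {
        foreach (string result in _callbackQueue.GetConsumingEnumerable())
        {
            // hand the result back to the screen that requested it.
            Console.WriteLine("dispatching " + result);
        }
    }
}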


Friday, July 4, 2014

Push design


Everything in the communication has a time limit

In a push design, one of the bigger challenges is to make sure that both the server and the client side store the data/request in a "queue", which is then handled when some threads are free to process it. By using the "queue", the client/server can end the current method call right after the data/request has been queued for later processing.

In the WCF context, you have two types of design to process the data/request:

1. using "function" design (works like "DateTime.Now" which returns the value immediately) - for example, the client sends "current time" command to the server and expecting the server responding (almost) immediately at the end of the calls.

2. using "callback" design - for example, the client sends "current time" command to the server and does not wait for the server respond. Instead, the server will send the current time through callback.

Both designs have pros and cons and it all depends on your need.

- The "callback" design allows the server to take it's precious time to prepare the necessary data for the client. In case the server is busy or the resources were blocked, it just have to wait until those resources were freed up. It also allows the server to schedule the process later. Upon completion, the server will make callback to the client. This is acceptable if it is not a real-time system.

- The "function" design - the client is always waiting for the result and it needs it now. By using this design, your server is running on deadlock risk (i.e., competing for the resources and locked the resource that other client is asking for). Since all the clients want it now, the deadlock will occur as soon as the same resources were requested by multiple clients. Of course, the deadlock can be avoided with proper locking mechanism.

Even with WCF, the connection will get disconnected

This is not true if you have full control over the server and the client: WCF allows the system administrator to tune the "keep alive" time limit. But in case you don't have full server access, you need to do something to keep the connection alive.


This can be done by sending a NOOP command (i.e., a dummy command that does not perform any action) from the client to the server - this keeps the connection alive. In case the connection has broken, you just need to re-establish it.

To send the NOOP command repeatedly, you just need a System.Threading.Timer object which queues the NOOP command every 1 or 2 seconds.
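
A minimal sketch of the keep-alive timer; the proxy call is commented out because it depends on your own service contract.

using System;
using System.Threading;

public class KeepAlive
{
    private Timer _timer;

    public void Start()
    {
        // queue a NOOP command every 2 seconds to keep the WCF connection alive.
        _timer = new Timer(state =>
        {
            try
            {
                // hypothetical dummy call on the WCF proxy.
                // _proxy.Noop();
            }
            catch (Exception)
            {
                // re-establish the connection here.
            }
        }, null, TimeSpan.FromSeconds(2), TimeSpan.FromSeconds(2));
    }

    public void Stop()
    {
        if (_timer != null) { _timer.Dispose(); }
    }
}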

Tuesday, April 15, 2014

How to get the updated data - Push VS Pull

In a client/server or cloud environment, your program often needs to monitor updates on the server. There are two ways to catch the updates: either a Push or a Pull design.

  • Push design - with this design, when an update happens, the server notifies the client program. The design is more complicated (in both the client program and the server program) compared to the Pull design.
  • Pull design - with this design, the client program continuously queries the server for updates. Of course, this design is very simple but it comes with a bigger cost (in terms of bandwidth and server processing power) when the number of connections grows.
In order to serve more client connections and reduce the bandwidth consumption, you will have to implement the Push design.

Using a socket or web socket to implement the push design:
  • This is one of the basic elements of a push design, so you must learn how to write socket programs. With .NET, you may use WCF (Windows Communication Foundation) to implement this idea, but you still need to learn the technical details of what a socket is and how it works with different configurations.
  • Imagine that user A keys in a new blog post through a website and then all the followers are notified within a few seconds. In this case, the server sends a signal (either using TCP or UDP) to the "online users" (i.e., each user must run a client program and sit at the computer waiting for the incoming signal). The preferred way to send the signal is UDP; you can find lots of information about TCP vs UDP.
  • Other than how to send the signal, one of the challenges is how secure your data is when it travels from the server to the client or vice versa. Of course, with WCF, you have the choice of different configurations. On other platforms (other than .NET), you might have to implement security over the socket communication using SSL/TLS. Just to share with you: you can implement SSL/TLS in Python easily.
  • I guess we are quite lucky with modern programming languages because most of them support asynchronous designs with a few keyword changes. We need to learn async programming as well, or otherwise the server program will not be able to scale up.


Wednesday, November 27, 2013

Errors that appear in the WCF client app...

I was setting up a new server to host my WCF app and I hit the wall with the following error messages. I spent two days solving this problem. I guess many people are wondering how to resolve these interesting errors.

  • The server has rejected the client credentials.
  • The socket connection was aborted. This could be caused by an error processing your message or a receive timeout being exceeded by the remote host, or an underlying network resource issue. Local socket timeout was '00:00:59.9687500'.
  • The communication object, System.ServiceModel.ServiceHost, cannot be used for communication because it is in the Faulted state.

I have developed a WCF solution in WinForm which allows you to comment out certain settings on the server and client so that you can reproduce the above-mentioned error messages.

  https://github.com/lauhw/WcfWinForm

Please be aware that if you are running the WCF server and client on the same computer, the above errors might not appear until you run them on separate computers.

In a nutshell:
  1. Both the server and client config files must have the same settings in the "binding" element, including the "security mode" (see the snippet after this list).
  2. The "identity" settings must be the same in both the server and client config files.
  3. In the live environment, make sure you remove the "MEX" endpoint unless you want anyone to be able to query the interface.
  4. Don't forget to open the port in the firewall.
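
For point 1, this is the kind of binding block that must be identical on both sides; the binding name and security values here are only an example, not the settings from the test solution.

  <bindings>
    <netTcpBinding>
      <binding name="secureTcp">
        <security mode="Transport">
          <transport clientCredentialType="Windows" />
        </security>
      </binding>
    </netTcpBinding>
  </bindings>
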
In case you are hitting the same errors mentioned above, the fastest way to solve the problem is to run the test WCF solution (mentioned above). Make sure that the test solution works. After that, compare the differences in app.config between the test solution and your solution.

Hope you don't have to spend 2 days to solve the configuration problem. ;)

Thursday, April 25, 2013

Session is not missing in the Handler

I still remember the first time I developed with a generic handler (.ashx) and found out that the session was missing. This is because the handler implements only the basic features (as compared to the Page class).

In order for the AJAX call to access the session information, your handler must implement one of the following interfaces:

    System.Web.SessionState.IRequiresSessionState

OR

    System.Web.SessionState.IReadOnlySessionState

That's all you need.
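
For example, a handler that reads and writes session values only needs the interface on its class declaration; this is a minimal sketch, not from any particular project.

<%@ WebHandler Language="C#" Class="session_demo" %>

using System;
using System.Web;
using System.Web.SessionState;

// IRequiresSessionState gives read/write access to the session;
// use IReadOnlySessionState if the handler only reads it.
public class session_demo : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        // the session is now available, just like in a Page.
        context.Session["last_visit"] = DateTime.Now;
        context.Response.Write("session id = " + context.Session.SessionID);
    }

    public bool IsReusable { get { return false; } }
}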

Saturday, March 2, 2013

Posting data in JSON format to the ASP.NET website

When you are posting data using jQuery AJAX, you may call the $.post() method to submit the data. We normally post the "data" by specifying the parameter names and values in the "data" parameter of the $.post() method. But it becomes very tedious to add a new field to the data parameter when you already have lots of fields. To ease code maintenance, you may post the data in JSON format. This is quite simple: instantiate a new object and set its properties with the values.

For example, I wrote this script in a HTML file:

    <script src="../Scripts/jquery-1.7.js" type="text/javascript"></script>
    <script type="text/javascript">
        function postJson() {
            var m = {
                myname: 'abc',
                myage: 10
                // add more properties here
            };
            var url = 'post_json.aspx';
            $.post(url, JSON.stringify(m), function (d) {
                if (d && d == 'ok') {
                    alert('ok');
                }
                else {
                    alert('failed');
                }
            });
        }
    </script>

Below is the "post_json.aspx" which you may use Handler (.ashx) to avoid the ASP.NET page life cycle.

using System.Web.Script.Serialization;

    public partial class post_json_post_json : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {      
            string json_text = this.Request.Form[0];
            JavaScriptSerializer js = new JavaScriptSerializer();
            CInfo my_obj = (CInfo)js.Deserialize(json_text, typeof(CInfo));

            this.Response.Write("ok");
            this.Response.End();
        }
    }

This is the data class that I'm using in this demo:

    public class CInfo
    {
        public string myname { get; set; }
        public int myage { get; set; }
    }

By using this technique, you achieve the following:
  • Easy to maintain the JavaScript code - the data parameter in $.post() is replaced with stringified data in JSON format.
  • Easy to maintain the code in ASP.NET - you just need to declare a class which contains all the properties matching the data in the JSON. You may declare a property with another class type as well, and object arrays are also supported.