Create valid self-signed certificates using OpenSSL


I was debugging a WebSocket connection failing in Chrome with the error net::ERR_INSECURE_RESPONSE when I learnt that the self-signed certificate I was using was missing subject alternative names. This post brings together information I found in several different places to create valid self-signed server certificates, using OpenSSL, that work with browsers such as Chrome.

[Image: valid-certificate-iis.png]

To create a certificate with subject alternative names

openssl req -x509 -newkey rsa:4096 -nodes -subj '/CN=localhost' -keyout key.pem -out cert.pem -days 365 -config openssl.cnf -extensions req_ext

Additional distinguished name properties may be specified by changing the -subj option

-subj "/C=US/ST=state/L=city/O=organization/CN=hostname.example.com"

A minimalist openssl.cnf file that contains a req_ext extension section with subjectAltName

[ req ]
distinguished_name = req_distinguished_name
req_extensions     = req_ext
[ req_distinguished_name ]
[ req_ext ]
subjectAltName = @alt_names
[alt_names]
DNS.1   = localhost
DNS.2   = example.com

Print certificate to view subject alternative names and thumbprint/fingerprint

openssl x509 -noout -text -fingerprint -in cert.pem

Create pfx from private key and certificate in pem format

openssl pkcs12 -inkey key.pem -in cert.pem -export -out key.pfx

Create crt file from certificate in pem format

openssl x509 -outform der -in cert.pem -out cert.crt

Add the private key to the appropriate key store and reconfigure the server application.
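
On Windows, for example, the pfx created above can be imported into the local machine's personal store with certutil (it prompts for the export password, after which a server such as IIS can be pointed at the certificate)

certutil -importpfx key.pfx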

Add the certificate file to the trusted root certification authorities store. Restart the browser. It should be happy with the certificate provided by the server.
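
On Windows, for example, this can be done from an elevated prompt with

certutil -addstore Root cert.crt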

On Windows, PowerShell’s New-SelfSignedCertificate command can also be used to automate self-signed certificate creation and installation.
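
For example, something like the following creates a certificate with DNS subject alternative names and places it in the machine's personal store

New-SelfSignedCertificate -DnsName "localhost","example.com" -CertStoreLocation Cert:\LocalMachine\My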


Windows IoT Core application using Xamarin Forms


[Image: xamarin-forms-uwp.jpg]

Screenshot of my first Windows 10 IoT Core Xamarin Forms application running on a Raspberry Pi 3. The app is the first exercise in Udemy’s Xamarin Forms course. The UWP build targeting Windows IoT Core was created following Adding a Universal Windows Platform (UWP) App. Take a look at how other native UWP IoT Core samples look and work. Also take a look at the Xamarin Forms sample apps available on GitHub. You can also create an OS X application following Bringing macOS to Xamarin.Forms.

ONC RPC version 2 over TCP/IP


This post discusses the message structure of the Open Network Computing (ONC) Remote Procedure Call (RPC) protocol, version 2. The protocol is specified in IETF RFC 5531. RFC 4506 specifies the C-like data representation syntax used in RFC 5531. RFC 1833 specifies an RPC service (portmapper) used to discover RPC services provided by a host.

[Image: ONC RPC message dissected by the Wireshark RPC dissector]

The ONC RPC message structure is defined in the specification as follows

struct rpc_msg {
    unsigned int xid;               /* transaction id */
    union switch (msg_type mtype) { /* message type */
    case CALL:
        call_body cbody;
    case REPLY:
        reply_body rbody;
  } body;
};

An unsigned int, according to the XDR specification, is a 4-byte unsigned integer value in big-endian byte order. The transaction id is therefore a 4-byte value. The message type is also an unsigned int value: 0 indicates a call, 1 indicates a reply.

Message fragmentation is used over a stream-oriented protocol such as TCP. The transaction id is therefore preceded by an unsigned int value that indicates the size of the fragment in bytes. The most significant bit (MSB) of that unsigned int is a boolean value that, when set, indicates the last fragment of a sequence of fragments.
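
As a rough illustration, here is a minimal Python sketch (the helper names are mine, not part of any RPC library) that reads one fragment from a TCP socket and decodes the first two fields of the message

import struct

def recv_exact(sock, n):
    # read exactly n bytes from a connected TCP socket
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise EOFError('connection closed')
        buf += chunk
    return buf

def read_fragment(sock):
    # 4-byte record mark: MSB flags the last fragment, lower 31 bits give the size
    (mark,) = struct.unpack('>I', recv_exact(sock, 4))
    last = bool(mark & 0x80000000)
    length = mark & 0x7FFFFFFF
    return last, recv_exact(sock, length)

# xid and mtype are the first two unsigned ints of the message (0 = CALL, 1 = REPLY)
# last, data = read_fragment(sock)
# xid, mtype = struct.unpack('>II', data[:8])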

The call body in turn is defined as follows

struct call_body {
    unsigned int rpcvers;  /* must be equal to two (2) */
    unsigned int prog;     /* program identifier */
    unsigned int vers;     /* program version number */
    unsigned int proc;     /* remote procedure number */
    opaque_auth cred;      /* authentication credentials */
    opaque_auth verf;      /* authentication verifier */
    /* procedure-specific parameters start here */
};

enum auth_flavor {
    AUTH_NONE = 0
    /* and more */
};

struct opaque_auth {
    auth_flavor flavor;    /* authentication flavor */
    opaque body<400>;
};

If the authentication flavor in use is AUTH_NONE, the authentication credentials are encoded as an unsigned int value of 0 (the flavor), followed by another unsigned int value of 0 indicating a zero-length credential body. The authentication verifier is encoded in the same manner.
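
Putting the pieces together, a minimal Python sketch (function names are mine) that encodes a CALL with AUTH_NONE credentials, here for the well-known portmapper NULL procedure (program 100000, version 2, procedure 0), could look like this

import struct

def null_auth():
    # opaque_auth with flavor AUTH_NONE (0) and a zero-length body
    return struct.pack('>II', 0, 0)

def make_call(xid, prog, vers, proc, params=b''):
    # xid, mtype=0 (CALL), rpcvers=2, program, version, procedure
    body = struct.pack('>IIIIII', xid, 0, 2, prog, vers, proc)
    body += null_auth() + null_auth() + params
    # prepend the record mark: last-fragment bit set, lower 31 bits carry the length
    return struct.pack('>I', 0x80000000 | len(body)) + body

msg = make_call(xid=1, prog=100000, vers=2, proc=0)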

A reply is defined as follows

union reply_body switch (reply_stat stat) { /* Reply status */
case MSG_ACCEPTED:
    accepted_reply areply;
case MSG_DENIED:
    rejected_reply rreply;
} reply;

struct accepted_reply {
    opaque_auth verf;
    union switch (accept_stat stat) { /* accepted status */
    case SUCCESS:
        opaque results[0];
        /*
        * procedure-specific results start here
        */
    case PROG_MISMATCH:
        struct {
            unsigned int low;
            unsigned int high;
        } mismatch_info;
    default:
        /*
        * Void.  Cases include PROG_UNAVAIL, PROC_UNAVAIL,
        * GARBAGE_ARGS, and SYSTEM_ERR.
        */
        void;
    } reply_data;
};

union rejected_reply switch (reject_stat stat) {
case RPC_MISMATCH:
    struct {
        unsigned int low;
        unsigned int high;
    } mismatch_info;
case AUTH_ERROR:
    auth_stat stat;
};

The reply status is an unsigned int value. A reply status value of 0 indicates an accepted message, which is followed by the authentication verifier (encoded as explained earlier) and an unsigned int indicating the accepted status (0 is success). A reply status of 1 indicates a rejected message, which is followed by an unsigned int indicating the rejection status.
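
A minimal Python sketch (again, names are mine) that decodes the fixed part of a reply, once the record marking has been stripped off, might look like this

import struct

def parse_reply(data):
    xid, mtype, reply_stat = struct.unpack('>III', data[:12])
    assert mtype == 1                        # 1 = REPLY
    if reply_stat == 0:                      # MSG_ACCEPTED
        flavor, length = struct.unpack('>II', data[12:20])
        offset = 20 + ((length + 3) & ~3)    # skip the verifier body, padded to 4 bytes
        (accept_stat,) = struct.unpack('>I', data[offset:offset + 4])
        return xid, accept_stat, data[offset + 4:]   # 0 = SUCCESS, results follow
    # MSG_DENIED: reject_stat follows; mismatch_info or auth_stat details are ignored here
    (reject_stat,) = struct.unpack('>I', data[12:16])
    return xid, reject_stat, b''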

Blocks of string or opaque data are padded with 0 to 3 residual bytes so that their length is a multiple of 4. For example, a 5-byte opaque value occupies 8 bytes on the wire.

Capture loopback communication on Windows


Wireshark is unable to capture any loopback communication (not just traffic on the loopback interface) on Windows using WinPcap. You’ll need to replace WinPcap with Npcap to be able to do that.

Uninstall WinPcap first.

Select the following options (uncheck support for raw 802.11 traffic at your discretion) during Npcap setup

[Image: Npcap setup options]

To capture loopback traffic, capture on the Npcap Loopback Adapter

[Image: npcap-loopback.png]

Word to Markdown using Pandoc


Markdown has become the de-facto standard for writing software documentation. This post discusses converting Word documents to Markdown using Pandoc.

[Image: markdown.png]

If you haven’t already, install Pandoc. Word documents need to be in the docx format. Legacy binary doc files are not supported by Pandoc.

Pandoc supports several flavors of Markdown (md) such as the popular GitHub flavored Markdown (GFM). To produce a standalone GFM document from docx, run

pandoc -t gfm --extract-media . -o file.md file.docx

The --extract-media option tells Pandoc to extract embedded media to a ./media folder. The Markdown output links to the files in that folder.

Generating the Markdown document is just the first step. If you’re happy with the output, you can stop here, but I discuss additional changes that can make the document easier to maintain, and to read using HTML renderers such as GitHub’s markup.

Markdown Editor

You’ll need a text editor to edit an md file. I use Visual Studio Code (Code), which has built-in support for editing and previewing Markdown files, along with a few additional plugins that make editing Markdown files more productive.

Tables

Pandoc will render tables whose cells have a single (wrapped or not) line of text using the pipe table syntax. Column text alignment is not preserved; you’ll have to add that back manually.
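
For reference, alignment is specified with colons in the delimiter row of a pipe table (the values below are made up)

| Name | Qty | Price |
|:-----|:---:|------:|
| foo  |  1  |  9.99 |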

Tables whose cells have complex data such as lists and multiple lines are rendered in the HTML table syntax. It is not unusual for tables with complex layouts such as merged cells to be missing columns. Review all tables carefully. I suggest simplifying complex tables in the original Word document before conversion.

Small pipe and HTML tables are relatively easy to edit by hand. Editing large tables can quickly become cumbersome. Markdown editors such as Typora provide support for visually editing pipe tables. Typora does not support HTML tables.

Table of Contents

Pandoc dumps the table of contents (TOC) of the original docx as one line per topic. I suggest eliminating that TOC and generating a hyperlinked TOC using the Markdown TOC plugin for Code.

The plugin can also add, update, or remove section numbering. If you have cross-references in the Word document that use section numbers, this will, at least for the moment, give you a consistent document. In the long term, I suggest avoiding section numbers, and substituting textual cross-references with intra-document hyperlinks. See TOC generated by Markdown TOC to see intra-document hyperlinking in action.

Another option is to let Pandoc number sections (-N option) and render a table of contents automatically (--toc option) when rendering to HTML or PDF.

Images

Images are exported in their native format and size. They are inserted in the document using the ![caption](path) GFM syntax, or the img tag within HTML tables. Image size cannot be customized in GFM syntax; you may need to resize images to get a consistent size.
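
Alternatively, since GFM passes raw HTML through, an image can be sized with an img tag (the path below is illustrative)

<img src="./media/image1.png" width="400" />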

Diagrams

Pandoc is unable to render diagrams created using the figures and shapes available in Word. You’ll need to recreate those by screen-grabbing the output rendered by Word. You can also use mermaid.js syntax to recreate diagrams such as flowcharts and message sequence charts, embed them in the Markdown document, and render them using mermaid-filter.
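
For example, a simple flowchart in mermaid syntax looks like this (the node names are made up)

graph LR
    A[Request] --> B{Valid?}
    B -->|yes| C[Process]
    B -->|no| D[Reject]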

[Image: mermaid.js.png]

GitHub doesn’t yet render mermaid diagrams, but Code is able to render them with the help of the Mermaid Preview plugin, as is GitLab from version 10.3.

Render PDF

To render a PDF using Pandoc

pandoc file.md -f gfm -F mermaid-filter -o file.pdf --toc -N

Remove the -F mermaid-filter option if your document does not have any mermaid diagrams.

I noted several problems in rendered tables. Pipe tables with long lines are not wrapped, and stretch beyond the page. HTML tables are not rendered. To fix these problems, you may need to edit the text in the tables, use a custom LaTeX template, or use a different Markdown format with support for grid or multiline tables.

If you want to render HTML instead, change the extension of the output file from pdf to html

pandoc file.md -f gfm -o file.html

Large Documents

Pandoc can handle large documents that run to hundreds of pages. You may want to break a large document into separate Markdown files for maintainability. Users may have to wait a long time to preview a large document online, such as on GitHub or GitLab, and previewing may fail entirely on big and complex documents.

Pandoc can render multiple Markdown files

pandoc section-1.md section-2.md -f gfm -o file.pdf --toc -N

Regular Expressions

Regular expressions significantly speed up bulk search and replace operations.

Some useful regular expressions

#+\s*$     search empty headings
\s+$       search lines with trailing spaces
\b\s\s+\b  search repeated space between words
\|.*\|     search through all rows of pipe tables
section\s+(?!(\d+\.*\d*?){1,})
           search for cross-references starting with section but missing section number

.NET Core class library solution from scratch


This post documents using the dotnet command to create a class library solution from scratch. The solution builds a class library project and an MSTest unit test project that tests the class library.

To create an empty solution called MySolution.sln

dotnet new sln [--force] -n MySolution

sln is just one of several templates supported by the command. To see a list, try dotnet new -l. Additional templates can be installed using dotnet new --install e.g. AvaloniaUI.

To create a new class library project

dotnet new classlib [--force] -n MyLibrary

This creates a folder called MyLibrary and a MyLibrary.csproj file in it. Any C# files in the MyLibrary folder will be compiled during build.

If MyLibrary exists, use --force to replace the existing project file.

If your project has an AssemblyInfo.cs that contains assembly attributes, you can edit the project file to disable auto-generation of assembly attributes

<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
    <GenerateAssemblyInfo>false</GenerateAssemblyInfo>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.CSharp" Version="4.4.0" />
  </ItemGroup>

</Project>

Otherwise, you’ll get errors such as

obj/Debug/netcoreapp2.0/MyLibrary.AssemblyInfo.cs(10,12): error CS0579: Duplicate 'System.Reflection.AssemblyCompanyAttribute' attribute ...

Also, note the use of the Microsoft.CSharp package in the project file. It is required to use C# language features such as dynamic. Without it, you’ll get an error such as

MyClass.cs(177,50): error CS0656: Missing compiler required member 'Microsoft.CSharp.RuntimeBinder.CSharpArgumentInfo.Create'

To add the package reference, head into the MyLibrary project folder and run

dotnet add MyLibrary.csproj package Microsoft.CSharp

Then, run the following to restore packages from NuGet

dotnet restore

Head over to the solution folder, then add the class library project to the solution and build it

dotnet sln [MySolution.sln] add MyLibrary/MyLibrary.csproj
dotnet build

Specifying the solution name is optional if you’ve got just one solution file in the folder.

To add a new MSTest unit test project

dotnet new mstest [--force] -n MyLibraryTest

Head into MyLibraryTest and add a reference to MyLibrary and package references

dotnet add MyLibraryTest.csproj reference ../MyLibrary/MyLibrary.csproj
dotnet add MyLibraryTest.csproj package Microsoft.CSharp
dotnet restore

Head over to the solution folder, build, and run unit tests

dotnet build
dotnet test MyLibraryTest

That wraps up the basic usage of dotnet to create and maintain a simple .NET Core class library project.

Highlighting problems in Lua dissectors


Here’s a snippet of code from the nordic_ble dissector that shows how you can highlight problems in Lua dissectors using add_expert_info

        local item  = tree:add_le(hf_nordic_ble_micok, tvb(UART_PACKET_FLAGS_INDEX, 1), micok > 0)
        if micok == 0 then
            -- MIC is bad, flag the item with expert info
            item:add_expert_info(PI_CHECKSUM, PI_WARN, "MIC is bad")
            item:add_expert_info(PI_UNDECODED, PI_WARN, "Decryption failed (wrong key?)")
        end

NOTE I recommend using add_proto_expert_info because add_expert_info is now deprecated.
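
A rough sketch of the newer API (the protocol object and field names here are illustrative, not taken from the nordic_ble dissector)

-- declare an expert info field and register it with the protocol
local ef_mic_bad = ProtoExpert.new("nordic_ble.mic.bad", "MIC is bad",
                                   expert.group.CHECKSUM, expert.severity.WARN)
p_nordic_ble.experts = { ef_mic_bad }

-- later, inside the dissector function, flag the tree item
item:add_proto_expert_info(ef_mic_bad)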