Posts for addison.im blog (http://www.addison.im/blog)

All of Me

Last weekend I recorded a take of the standard All of Me. Since I have just been using my phone to record, the backing track was hard to hear originally; however, my friend Jeremy worked some audio magic and was able to make it sound great.

post/42 Fri, 14 Apr 2017 15:04:30 +0000
PgSqlLib - A Class Library for PostgreSql DAL in ASP.NET Core

ASP.NET Core is a lean and composable framework for building web and cloud applications. One of the awesome aspects of the framework is that it runs on Linux and OSX in addition to Windows.

A few months ago I started building a few applications to get a feel for the framework and decided to go with PostgreSQL for the database, since I was planning on hosting the apps on an Ubuntu server. Although ASP.NET Core supports Entity Framework, I decided to create my own library for the DAL (data access layer).

PostgreSQL has .NET support through the Npgsql library, an ADO.NET Data Provider for PostgreSQL. PgSqlLib is basically a wrapper around the Npgsql data provider to make writing the data access layer less verbose. The source code for the project can be found on GitHub. Documentation for setting up the library can be found below.

Using with ASP.NET Core WebAPI


In your Startup.cs file add the DataService dependency to the ConfigureServices method:

public void ConfigureServices(IServiceCollection services)
{
    // code above omitted for brevity
    services.AddMvc();
    services.AddScoped<IDataService, DataService>();
}

Create a BaseController with a dependency on the DataService:

using Microsoft.AspNetCore.Mvc;
using PgSqlLib;

namespace YourAppNamespace.AppClasses
{
    public class BaseController : Controller
    {
        internal IDataService _DataService { get; set; }

        public BaseController(IDataService _dataService) 
        {
            _DataService = _dataService;
        }
    }
}

Controllers that inherit from BaseController can then use the DataService to talk to the PostgreSql database. For example, a controller for the model ModelName would have a method like the one below to get an object by id.

[HttpGet("{id}")]
public async Task<ModelName> Get(Guid id)
{
    ModelName modelName = await _DataService.ModelName.Get(id.ToString());
    
    if (modelName == null) 
    {
        Response.StatusCode = 404; // not found
    }

    return modelName;
}
 

Note: Since id could be a Guid or an integer, the Get method in the Repository takes a string parameter. For your use case you may prefer to add parameter overloads for the Repository<T>.Get method.
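Such overloads are not part of the library; below is a minimal sketch of what they could look like, assuming Repository<T> exposes the string-based Get method used in the controller example above:

```csharp
// Hypothetical overloads for Repository<T> that forward to the existing
// string-parameter Get method; these are illustrative, not part of PgSqlLib.
public async Task<T> Get(Guid id)
{
    return await this.Get(id.ToString());
}

public async Task<T> Get(int id)
{
    return await this.Get(id.ToString());
}
```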

Configuring the DAL


Setting up the PostgreSql Database

In the directory /src/PLpgSql there are example stored procedures as well as an example table schema for setting up the database to work with the library. Since this library was built without Entity Framework, the table schema and stored procedures have to be scripted by hand. The examples use PL/pgSQL, but any of the supported procedural languages can be used (PL/pgSQL, PL/Tcl, PL/Perl, or PL/Python).
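As a rough illustration (the real examples live in /src/PLpgSql), a get-by-id function in PL/pgSQL could look like the following; the table name and column names here are assumptions:

```sql
-- Hypothetical PL/pgSQL get-by-id function; table and column names are
-- examples only and should match your own schema.
CREATE OR REPLACE FUNCTION get_model_name_by_id(p_model_id uuid)
RETURNS SETOF model_name AS $$
BEGIN
    RETURN QUERY
    SELECT * FROM model_name WHERE model_id = p_model_id;
END;
$$ LANGUAGE plpgsql;
```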

Note: The schema and stored procedures are intended to be used as an example / template. For the models in your application you will have to alter them for your needs.

Models

Models in this library work just like they would in MVC or WebAPI; however, properties that map to table columns need an additional attribute. The ColumnName attribute, with a string parameter corresponding to the PostgreSql column name, should be applied to these properties: for example, [ColumnName("model_id")].
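For example, a model matching the columns used later in this post might look like the sketch below (the class and namespace names are placeholders):

```csharp
using System;
using PgSqlLib; // assumed location of the ColumnName attribute

namespace YourAppNamespace.Models
{
    // Hypothetical model; properties map to PostgreSql columns via ColumnName
    public class ModelName
    {
        [ColumnName("model_id")]
        public Guid Id { get; set; }

        [ColumnName("name")]
        public string Name { get; set; }

        [ColumnName("created")]
        public DateTime Created { get; set; }
    }
}
```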

PgSql Objects

Each list, get, save, and delete stored procedure created for models will need an entry, with the appropriate name and parameters, in the corresponding Dictionary found in PgSqlLib.PgSql.PgSqlObjects. An example for a get-by-id procedure is below.

private Dictionary<Type, PgSqlFunction> _getProcedures = null;
public Dictionary<Type, PgSqlFunction> GetProcedures
{
    get
    {
        if (_getProcedures == null)
        {
            _getProcedures = new Dictionary<Type, PgSqlFunction>
            {
                // add get procedures here
                {
                    typeof(ModelName), new PgSqlFunction
                    {
                        Name = "get_model_name_by_id",
                        Parameters = new NpgsqlParameter[] { PgSql.NpgParam(NpgsqlDbType.Uuid, "p_model_id") }
                    }
                }
            };
        }
        return _getProcedures;
    }
}

Repository and DataService

For each model in the PostgreSql database an instance of the class PgSqlLib.Repository<T> should be added to the PgSqlLib.DataService class. See the example below for adding a model called "ModelName" to the DataService.

public class DataService : IDataService
{
    // Add each model as property 
    public IRepository<ModelName> ModelName { get; set; }

    public DataService() 
    {
        // initialize the property to a new instance of the Repository class
        this.ModelName = new Repository<ModelName>(); 
    }
}

Mapping Objects to Models

For each model, code should be added to the PgSqlLib.App_Classes.Extensions.ToModel method to parse the NpgsqlDataReader to a Model class. Eventually mapping will be done using reflection (phase 2). See the example below for parsing data to the "ModelName" class.

public static T ToModel<T>(this DbDataReader @this) where T : class
{
    T objectCast = null;

    // return early if no data 
    if (!@this.HasRows || @this.FieldCount == 0)
        return objectCast;

    // map NpgsqlDataReader to ModelName type
    if (typeof (T) == typeof (ModelName) && objectCast == null) 
    {
        var modelName = new ModelName 
        {
            Id = Guid.Parse(@this["model_id"].ToString()),
            Name = @this["name"].ToString(),
            Description = @this["description"].ToString(),
            Created = @this["created"] != DBNull.Value ? DateTime.Parse(@this["created"].ToString()) : DateTime.MinValue,
            Updated = @this["updated"] != DBNull.Value ? (DateTime?)DateTime.Parse(@this["updated"].ToString()) : null,              
        };

        objectCast = modelName as T;
    }

    
    return objectCast;
}
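The reflection-based mapping planned for phase 2 could look roughly like the sketch below. It assumes the ColumnName attribute exposes the column name through a Name property, which may differ from the actual library:

```csharp
// Rough sketch only: map reader columns to properties by reading each
// property's ColumnName attribute via reflection. ColumnNameAttribute.Name
// is an assumption about the attribute's shape.
public static T ToModelReflection<T>(this DbDataReader reader) where T : class, new()
{
    var model = new T();
    foreach (var prop in typeof(T).GetProperties())
    {
        var attr = prop.GetCustomAttribute<ColumnNameAttribute>();
        if (attr == null)
            continue;
        object value = reader[attr.Name];
        if (value == DBNull.Value)
            continue;
        // unwrap Nullable<T> so Convert.ChangeType targets the underlying type
        var targetType = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
        prop.SetValue(model, targetType.IsInstanceOfType(value)
            ? value
            : Convert.ChangeType(value, targetType));
    }
    return model;
}
```

This needs using System.Reflection for GetCustomAttribute, and like ToModel it would live in a static extensions class.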

Notes about Project


Eventually I plan on updating this library to use more reflection so there is less configuration involved. I may also switch the procedural language to PL/Python.

An example of using this library in an application can be found in the Quiz-O-Matic application.

post/40 Mon, 03 Apr 2017 00:24:35 +0000
My Funny Valentine A take I did on the standard My Funny Valentine. I've been trying to play more music lately so hopefully there will be more recordings posted soon.

post/38 Sat, 25 Mar 2017 19:44:13 +0000
Razor Engine CMS: A CMS built with .NET and Razor Engine

Project Source Code

The web application I work on has a custom CMS for the front end, where all of the HTML, CSS/LESS, and JavaScript development is done. The templating framework used for building pages in the app was created by a previous developer and is written in PostScript.

The syntax of the language is reverse Polish notation, which means things are backwards: for example, to add the numbers one and two you would write "1 2 +", since the language is stack oriented. As you can imagine, this is kind of a pain to work with and makes it hard to get new people up to speed, so I started looking into the idea of bringing Razor into our CMS. After a quick Google search I found the framework Razor Engine, and that got me started.

The app I work with uses MVC in parts, but is really more of a SPA (Single Page Application), so most of the pages either get data through a stored procedure call using the PostScript templating framework or using Ajax. To pass data to Razor Engine, I decided to make another page field for a custom model. In this page field / section, you can write C# with access to the RazorEngineCms.PageModelClasses Library, where there are classes for getting includes, as well as data through calling stored procedures. The result of this Page Model section is then assigned to an anonymous object variable called Model which is passed to Razor Engine. Below is a screen shot showing this process:

[screenshot: razor-demo-1]

Inside the Page Template field, you have access to the Razor templating language for manipulating data from the model. Below is a screen shot of what this page would look like once it is compiled:

[screenshot: razor-demo-2]

Another benefit of using Razor in the CMS is that, unless a page uses a query string variable, the page can be pre-compiled and the compiled page stored in the database. For pages that do use URL parameters, a caching process is in place to cache the compiled page for each permutation of parameters. For example, the first time the page /example/sproc-call/22 is loaded it will be dynamically compiled, but after that the page will be rendered from cache.

Due to having to dynamically compile some of the pages, I came to the conclusion that for the CMS to be viable in a production environment there would have to be a background process to clean up the temporary files created both by Razor Engine and by dynamically compiling the model. For this reason, I decided that instead of trying to retrofit Razor into our CMS, a better solution would be to work towards taking front-end development out of the CMS and back into something like Visual Studio.

One of the main reasons for developing in a CMS is that we build software for an international audience, so pages have to be loaded with different languages and cultural conventions. In the current CMS, language resources and includes are database driven, but in the long run it would make more sense to bring things back into Visual Studio so resource files could be used. The downside of database-driven language resources and includes is that a database call has to be made for each include or phrase, which can result in some pages making around 100 separate database calls to load!

Overall, this was a fun project to work on, but I decided to put a fork in it as it was suffering from scope creep and I was tired of building a CMS.

post/37 Mon, 21 Nov 2016 22:32:40 +0000
Updating the site, blog, and Nginx to use Laravel 5

A few months ago I finished rebuilding the blog and the CMS I use to manage the site with Laravel 5. The bulk of the business logic is in a repository pattern, with a base class of CRUD methods that is used by the Users, Posts, and Comments repositories. Overall I really like the framework, with the exception of route creation being very verbose. As far as I can tell, a route has to be specified for every controller method; there is no logic like in ASP.NET MVC, where controller endpoints are discovered using reflection. For example, in ASP.NET MVC a route template like /{controller}/{action} covers all controller methods.

I updated the front-end to use the latest version of Bootstrap and kept it simple since I am mainly interested and work in back-end development.

Since my server hosting was expiring this month I also decided to switch from A Small Orange to Digital Ocean. So far Digital Ocean is awesome and creating your own Ubuntu server and configuring it to the LEMP (or LAMP) stack is easy with their tutorials and documentation.

To get Laravel 5 to work with Nginx, the config /etc/nginx/sites-available/default needs the following edits:


root /var/www/html/yourLaravel/public;

location / {
    try_files $uri $uri/ /index.php?$query_string;
}

location ~ \.php$ {
    try_files $uri /index.php =404;
    fastcgi_split_path_info ^(.+\.php)(/.+)$;
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    include fastcgi_params;
}

Another update required by the above Nginx change is to edit the PHP-FPM config file /etc/php/7.0/fpm/pool.d/www.conf, changing

listen = /some/path/to/php/

to

listen = 127.0.0.1:9000

After making the above configuration changes, remember to restart the services by running the following commands:

sudo service php7.0-fpm restart
sudo service nginx restart

The code for the updated website can be found here and the updated Laravel blog code is here.

post/36 Sun, 07 Aug 2016 23:37:06 +0000
Adding a namespace to jQuery extension functions

The JavaScript library jQuery lets you extend it with your own functions. To prevent your custom functions from having conflicting names, it is a good idea to put them in their own namespace. One method of doing so is to create an object and put the functions in it so they can be called like:

  $(yourSelector).yourNamespace.function()

I first tried to implement this idea but ran into the issue of not being able to access the $(this) of the parent $(yourSelector).yourNamespace in the function itself, since "this" has a different context inside the function.

My resolution was to create one function that takes the name of a callback, which that function then invokes. The different functions are still stored in an object, but since they are called by the namespace function, the correct context of $(this) can be used.

An example of calling a function using this approach would look like:

$(yourSelector).yourNamespace("function"); // where "function" is the name of a callback defined in yourNamespace()

A callback function can be added to $(yourSelector).yourNamespace(); by adding to the callbacks object similar to below:

  // example callback function
  callbacks.yourFunction = function (params) {
      if (params.length != 2) {
          return { msg: "invalid params", numParams: 1 }; // second param is default and always set
      }
      var yourParam = params[0];
      var jquery = params[1];
      // do something with params
      // example jQuery html function call on $(this) of yourNamespace
      jquery.html("<div>" + yourParam + "</div>");
      return { msg: "OK" };
  };

  // can be called like 
  $(".selector").yourNamespace("yourFunction", [ yourParam ]);

In the GitHub repository linked at the end of the post there is another example that also shows how error logging works, along with some additional callback functions. If invalid parameters or an unknown callback function name are passed, an error message is logged to the console using the logger object.
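Stripped of jQuery, the dispatch idea boils down to a lookup table of callbacks plus one function that invokes them with the caller's this, preserving context. The sketch below (all names illustrative, no jQuery required) summarizes the pattern:

```javascript
// A plain-JS sketch of the single-dispatcher idea used by yourNamespace():
// callbacks live in an object, and one dispatcher invokes them by name with
// the caller's `this`, so context is preserved.
var callbacks = {
    // doubles every number in this.items (a stand-in for the jQuery element)
    double: function (params) {
        this.items = this.items.map(function (n) { return n * 2; });
        return { msg: "OK" };
    }
};

function yourNamespace(name, params) {
    if (typeof callbacks[name] !== "function") {
        return { msg: "unknown callback: " + name };
    }
    // invoke with the outer `this`, mirroring how $(this) is passed along
    return callbacks[name].call(this, params || []);
}

var target = { items: [1, 2, 3], run: yourNamespace };
var result = target.run("double");
```

Unknown callback names fall through to an error result, which is where the logger object hooks in within the real plugin.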

Github Repository

post/35 Mon, 06 Jun 2016 23:56:59 +0000
Compilation of songs I know on acoustic guitar

Due to the recording being done with my phone it is not the best quality and needs to be turned up in order to hear. I left the recording pretty much raw except for a noise reduction pass to get rid of a static/clicking sound, which helped the quality but made it even quieter.

Check back for updates on where this project is going. My list of side projects is only getting bigger; however, I hope to finish this one over the summer.

post/32 Mon, 14 Jul 2014 06:09:02 +0000
Migrating a WordPress Multisite into AWS with EC2 + RDS

For performance and ease of maintenance we decided to use an EC2 instance to run the WordPress site, with the database set up in RDS. Since I was leaving once the sandbox environment was set up, I tried my best to document the setup process, which can be seen here.

Overall I am really impressed with AWS and the sandbox environment. The customer service is helpful, the amount of control over the servers is awesome, and the new site is fast: page load times went from eight seconds (I know, way too slow) to three seconds. Most of this is due to using Require.js and asynchronous JS calls to parse RSS feeds, but I think moving off GoDaddy helped too.

post/31 Tue, 01 Apr 2014 12:09:58 +0000
OneK - A challenge to create a webapp in under 1,000 Bytes of data

There was a post on Reddit about onekb.net, where you can "Get one KiloByte of free webspace for just 0.00000001 bitcoin per hour!" The goal of the post was to see what you could create in under 1 KB of data. I came up with the idea of making a Reddit RSS reader (unoriginal idea, I know).

After I finished a version to host, I realized that I could not use any PHP on the site, just JS/HTML. I plan on making a JS version in the future, but right now it is in either PHP/XSL or just PHP. The latest version is hosted on my site at addison.im/oneK/, and I also have reddit.onekb.net/ for another couple of hours.

Once on the site you can search by subreddit by adding it to the URL like another directory. For example, to list only posts in /r/webdev/ you would go to /oneK/webdev/.

The site pulls in content from the RSS feed at http://reddit.com/.rss using a combination of PHP and XSLT (Extensible Stylesheet Language Transformations). The website I help develop at work uses RSS feeds to pass content around the site, since WordPress automatically generates them for posts. Today at work, for some reason, RSS feeds stopped working inside the building's network. When I noticed my site wasn't working I thought there was an issue with the XML-to-XSLT transformation, so I rewrote the site to use only PHP (I actually prefer plain PHP for parsing XML feeds, but I thought XSLT would be more concise, which turned out to be wrong), but the error persisted.

Here is the GitHub repo for the site. The live version is way over the 1,000 Byte limit since I chose to add the subreddit feature. I plan on having a new version under 1 KB, in just PHP/CSS, up soon, as well as adding some more styles to the current version, like a header/menu with subreddits.

post/30 Thu, 24 Apr 2014 02:21:15 +0000