Graphing United States Road/Highway Lane Mileage by State and Type Of Road

Thursday, December 27, 2018 by Nate Bross

There are a few data visualization projects I want to tackle, and a good number of them have a cartography component. I knew I had to get started with one and build on it, or I'd never get any of them rolling. I ran across a 'Functional System Lane Length' data set from the Federal Highway Administration and figured the table could be cleaned up and presented in a more intuitive way using a map, and that it would force me to kick-start this little prototype.

This is what I came up with. It's still a work in progress, though updates will likely come slowly and will be applied to the next data visualization project. This is how I approached it.

Visualization Technology

I had worked with jqvmap on some previous projects, and they have a great sample for color-coding a map based on a set of data, so I decided to stick with something I already knew. I also knew I wanted to put this on a separate page that I could embed on my blog here, so I wanted to make sure it was completely self-contained.
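For reference, jqvmap's color-coded sample boils down to something like this; the '#vmap' element id and the handful of hard-coded values (rural Interstate lane miles for the first few states in the table below) are just illustrative, not the exact code from my page:

jQuery('#vmap').vectorMap({
    map: 'usa_en',                       // the bundled US map
    backgroundColor: null,
    color: '#ffffff',
    hoverOpacity: 0.7,
    selectedColor: '#666666',
    enableZoom: true,
    showTooltip: true,
    values: { 'al': 2414, 'ak': 2057, 'az': 3714 },
    scaleColors: ['#C8EEFF', '#006491'], // light-to-dark scale used for the color coding
    normalizeFunction: 'polynomial'
});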

Data Processing

The data was available as an HTML table, as well as PDF and Excel. I found an online HTML-table-to-JSON converter and used it to build an array of data from the table. This got me close to what I wanted, but it left a bad taste in my mouth to index into a multidimensional array every time I wanted to change data sets. This is what it looked like before any conversion:

[
    ["INTERSTATE","OTHER FREEWAYS AND EXPRESSWAYS","OTHER PRINCIPAL ARTERIAL","MINOR ARTERIAL","MAJOR COLLECTOR","MINOR COLLECTOR (2)","LOCAL (2)","TOTAL","INTERSTATE","OTHER FREEWAYS AND EXPRESSWAYS","OTHER PRINCIPAL ARTERIAL","MINOR ARTERIAL","MAJOR COLLECTOR","MINOR COLLECTOR","LOCAL (2)","TOTAL","",""  ],
    ["Alabama","2,414","-","6,045","8,398","24,615","12,441","94,725","148,638","2,189","138","4,905","6,070","7,547","372","41,481","62,701","211,339"  ],
    ["Alaska","2,057","-","1,612","867","2,743","2,862","15,294","25,434","303","-","503","475","510","472","3,898","6,162","31,597"  ],
    ["Arizona","3,714","70","3,417","2,691","8,552","3,789","61,350","83,584","1,462","1,552","3,771","9,278","4,352","456","40,505","61,375","144,959"  ],
    ["Arkansas","1,752","288","4,996","6,360","23,780","13,656","122,548","173,378","1,469","424","2,208","4,364","4,504","499","23,685","37,154","210,532"  ],
    ["California","5,254","1,518","8,253","12,647","24,037","15,080","82,233","149,022","9,671","9,283","25,149","30,758","27,593","644","142,263","245,361","394,383"  ],
]

Clearly, something needed to be done, so I wrote a small processing utility and JSON.stringify'd the output:

// Turn the raw rows into named objects so the rest of the code isn't
// indexing into a multidimensional array. The source table uses "-" where
// a state has no mileage of a given type, so strip the thousands
// separators and treat anything non-numeric as 0.
function toNum(value) {
    var n = parseInt(String(value).replace(/,/g, ""), 10);
    return isNaN(n) ? 0 : n;
}

var ddt = [];
for (var i = 0; i < states.length; i++) {
    var abvr = StateAbvrFromName(states[i][0]); // data has state names; jqvmap uses state codes: il, ca, etc.
    ddt.push({
        'State': { 'Name': states[i][0], 'Code': abvr },
        'Rural': {
            'Interstate': toNum(states[i][1]),
            'Other_Freeways_Expressways': toNum(states[i][2]),
            'Other_Principal_Arterial': toNum(states[i][3]),
            'Minor_Arterial': toNum(states[i][4]),
            'Major_Collector': toNum(states[i][5]),
            'Minor_Collector_2': toNum(states[i][6]),
            'Local_2': toNum(states[i][7]),
            'Total': toNum(states[i][8]),
        },
        'Urban': {
            'Interstate': toNum(states[i][9]),
            'Other_Freeways_Expressways': toNum(states[i][10]),
            'Other_Principal_Arterial': toNum(states[i][11]),
            'Minor_Arterial': toNum(states[i][12]),
            'Major_Collector': toNum(states[i][13]),
            'Minor_Collector_2': toNum(states[i][14]),
            'Local_2': toNum(states[i][15]),
            'Total': toNum(states[i][16]),
        },
        'Total': toNum(states[i][17]),
    });
}
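With the data in that shape, feeding the map is just a matter of flattening one slice of it into the { 'stateCode': value } object jqvmap expects. Something along these lines (buildValues is an illustrative helper name, not the exact code from my page):

// Build the { 'al': 148638, 'ak': 25434, ... } object jqvmap wants for a
// given area ('Rural' or 'Urban') and road type ('Interstate', 'Total', etc.).
function buildValues(area, type) {
    var values = {};
    for (var i = 0; i < ddt.length; i++) {
        values[ddt[i].State.Code.toLowerCase()] = ddt[i][area][type];
    }
    return values;
}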

Throwing It All Together

Going into this little project, I knew I wanted to run without any server-side code, I wanted users to be able to manipulate the data to change the map, and I wanted it to be as lightweight as possible. I didn't want any node modules or components or any build-time tooling to be required to make this work.

I opted to use some jQuery event handling to glue everything together, and I'm actually happy with how it worked out.
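The glue amounts to a change handler that rebuilds the values for the selected data set and recolors the map. A minimal sketch, using jqvmap's documented 'set'/'colors' call and assuming a '#dataset' select element plus the buildValues helper above (both illustrative):

// Hypothetical <select id="dataset"> with option values like "Rural.Interstate".
jQuery('#dataset').on('change', function () {
    var parts = jQuery(this).val().split('.');   // e.g. ['Rural', 'Interstate']
    var values = buildValues(parts[0], parts[1]);

    // Find the largest value so everything can be scaled against it.
    var max = 1;
    jQuery.each(values, function (code, v) { if (v > max) { max = v; } });

    // Map each state's value onto a light-to-dark blue and push the colors to the map.
    var colors = {};
    jQuery.each(values, function (code, v) {
        var shade = (255 - Math.round((v / max) * 200)).toString(16);
        if (shade.length < 2) { shade = '0' + shade; }
        colors[code] = '#' + shade + shade + 'ff';
    });
    jQuery('#vmap').vectorMap('set', 'colors', colors);
});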


My Great Grandma's Buttermilk Pancakes

Saturday, November 17, 2018 by Nate Bross

The ingredients:

  • 2 cups flour
  • 2 tsp baking powder
  • 1 tsp salt
  • 1/2 tsp baking soda
  • 3 Tbs sugar
  • 2 eggs, separated
  • 2 cups buttermilk (or more)
  • 1/4 cup melted butter

Sift flour, measure and re-sift with other dry ingredients. Combine beaten yolks with buttermilk and add to dry ingredients. Add melted butter. Beat egg whites until stiff and fold into mixture. Bake on hot waffle iron or make into pancakes.

Introducing my latest side project: FMData

Wednesday, September 5, 2018 by Nate Bross

I switched jobs in March of this year. That's a story for a different time, but the important thing is that it reintroduced me to some technology I had not used in quite some time: FileMaker. FileMaker is a database system that provides a UI and basic scripting capabilities in a single package. Users access data stored in FileMaker databases through the FileMaker client, using layouts and scripts defined in the database. It's an all-in-one system.

As a web developer with a lot of C# code under my belt, I wanted to connect to data in FileMaker from the outside. Historically, the way to do this was through FileMaker's XML Publishing Engine; I picked up an existing open source project and modified it to suit my needs. Ten years ago that was a great solution, and it worked well. It still does, but as things change, so must we. FileMaker 17 introduced the Data API, which is RESTful and uses JSON, so a new package was needed. Enter my side project: FMData. I built this as a learning exercise, but quickly realized it could be useful to other developers.

The library provides coverage for the FileMaker 17 Data API, and I've just released v2.1, which cleans up a handful of bugs and improves coverage of the underlying FileMaker API. I still don't have full coverage, but I'm inching towards it. I have tons of ideas for features and enhancements, so be sure to keep an eye on the project. If you find it useful, let me know. If you find a bug, open an issue. If you have a feature idea, open an issue on GitHub and consider making a pull request.

The package is available on NuGet, and getting some data is really this simple:

var client = new FileMakerRestClient("server", "fileName", "user", "pass"); // fileName without the .fmp12 extension
var toFind = new Model { Name = "someName" };
var results = await client.FindAsync(toFind);
// results is an IEnumerable<Model> of records whose Name field matches "someName",
// executed against FileMaker as a find request.

That's all there is to it, and there are more examples on the project site: https://fmdata.io/

Continuous Integration for Open Source .NET Projects

Thursday, May 3, 2018 by Nate Bross

I operate a couple of niche open source projects. They don't generate much activity, but they've been useful to me over the years, so I share them with the world to help anyone else who happens upon them.

They're hosted on my GitHub page, which is great for sharing the source code and allowing folks to submit issues and pull requests (not that my projects are big enough to get any real activity, but I can hope). There isn't a good way to share the binary output from GitHub, though; you need to use additional tools and software. I'm using AppVeyor and MyGet, and I outline my configuration below.

The full CI setup could be achieved with MyGet alone, since they also offer build services, but I'm using a combination of MyGet (for pre-release package hosting) and AppVeyor (for builds).

AppVeyor Setup

In order to get my .NET Standard 2.0 library to build in AppVeyor I had to make a few changes from the default configuration. 

Build Setup


On the build configuration tab you need to tick the box to build NuGet packages and, most importantly, add a pre-build script that runs

dotnet restore

Deployment Setup

Note that this is the Deployment tab in the left-hand settings menu, as part of the build configuration, not the Deployments tab across the top of the AppVeyor project.

On the Settings >> Deployment tab, in order to push to MyGet you will need to provide the MyGet feed URL and API key. Both of these are easy to obtain from your feed's details page.

MyGet

There are plenty of resources for setting up a MyGet feed, so I'm not going into those details, but this is where you get the settings utilized in AppVeyor:

(Screenshot: the MyGet feed details page, showing the NuGet push URL.)

The last step is pushing the MyGet packages up to NuGet, which can be done directly through the MyGet interface. Right now, this is a manual process for me. I have two separate AppVeyor builds set up for the same project, pushing to the same MyGet feed: one connected to the develop branch and one linked to master. Within AppVeyor I have enabled assembly version patching, so they all end up in the MyGet feed and I can push the master releases out to NuGet.

I'm looking into having the build create release tags in the repository after a successful build, but haven't figured out how I want that to work yet.


Fix for Remote Desktop Connection Manager (RDCMan) on High DPI Devices

Tuesday, November 28, 2017 by Nate Bross

If you're like me, you probably find yourself needing to remote into servers from time to time. And, again if you're like me, you probably got tired of doing it manually and found a tool to help. I know I did: I found, and now live by, RDCMan.

One of its many beautiful features is the ability to store an encrypted file with all of your connection and display settings, along with credentials, so you're only a quick double-click away from the remote desktop you need to be on.

It's been working for me for years. But in the last few years high-DPI devices have become more common, and RDCMan didn't play well with them, at least not by default. One simple operating system setting was all it took to get it sorted out.

Simply go to the properties of your shortcut and, on the Compatibility tab, change the high-DPI scaling override from Application to System.


I'm rather disappointed it took me this long to figure out, but now that I have it working it's fantastic, and my remote sessions are no longer scaled way down to fit.
