Building Some ServiceNow Developer Utilities – Part 4

Since I last blogged, I’ve been busy working on several utility projects, reading several books, and I started working out during my lunch break (since no one at the office plays board games anymore).  I figured I’d share a few of those projects and ideas today.


This isn’t anything groundbreaking or amazing, just a simple script include that converts JSON objects to GlideRecords and GlideRecords to JSON objects.

I mainly made this for my unit testing.  It’s a lot easier to set up test records from JSON objects than with numerous GlideRecord.setValue statements.
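The actual script include isn’t shown here, but the core idea can be sketched in a few lines. Everything below is my reconstruction, not the real JSONtoGlide: the function names are mine, and a tiny stand-in plays the GlideRecord role so the shape of the conversion is visible outside a ServiceNow instance.

```javascript
// Sketch of the JSON <-> record idea (names are assumptions, not the
// actual JSONtoGlide script include).
function jsonToRecord(obj, gr) {
  // Copy each JSON property onto the record via setValue, as the real
  // utility would before calling gr.insert().
  for (var field in obj) {
    if (obj.hasOwnProperty(field)) gr.setValue(field, obj[field]);
  }
  return gr;
}

function recordToJson(gr, fields) {
  // Reverse direction: pull the listed fields off the record.
  var out = {};
  for (var i = 0; i < fields.length; i++) {
    out[fields[i]] = gr.getValue(fields[i]);
  }
  return out;
}

// Minimal GlideRecord stand-in, for demonstration only.
function FakeGlideRecord() { this._data = {}; }
FakeGlideRecord.prototype.setValue = function (f, v) { this._data[f] = v; };
FakeGlideRecord.prototype.getValue = function (f) { return this._data[f]; };

var gr = jsonToRecord({ short_description: 'Test incident', priority: '3' },
                      new FakeGlideRecord());
```

In a unit test, that means one JSON literal per test record instead of a pile of setValue calls.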


How many of you use some sort of metadata in custom tables to drive code rather than hard-coded values in a script?  How many of you have ever forgotten to include the most recent values in a migration document, and had defects arise because of it?  Since one of the tips in The Pragmatic Programmer (#61) is to not use manual procedures, I figured I’d be a good little developer and automate it.

I added a field to the Update Set table called “u_import_data”.  It’s a List field that references the Table table.  I didn’t put any restrictions on it, so technically one could put incidents, change requests, or any other table in it to migrate forward.

Next, I created a REST Message called “Automated Data Retrieval”.  The endpoint is as follows:


(I’m still on Dublin; in Fuji, the endpoint would be different: https://${environment}${table_name})

I changed the POST method to have “&sysparm_action=insert” appended to the base URL.  (In Fuji you won’t need that.)
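The endpoint image didn’t survive above, so here’s a hypothetical helper showing the shape of the Dublin-era insert URL, assuming the JSONv2 processor form (which is where sysparm_action=insert comes from). The hostname-and-table layout here is my assumption, not the post’s exact endpoint.

```javascript
// Assumed Dublin-era JSONv2 insert URL. sysparm_action=insert is the
// JSONv2 processor parameter; Fuji's Table API does not need it.
function buildInsertUrl(environment, tableName) {
  return 'https://' + environment + '/' + tableName +
         '.do?JSONv2&sysparm_action=insert';
}

var url = buildInsertUrl('dev.example.com', 'incident');
```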

Next, I created a script include to handle the work: SNDataCopy.  It accepts the utility class I developed (JSONtoGlide) and uses the REST message to retrieve data for a list of tables, then inserts (or updates) the records.  JSONtoGlide can be omitted or replaced by a utility of your choosing, and the script include will still work.
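The rough shape of an SNDataCopy-style loop looks like this. This is my reconstruction, not the actual script include: fetchRows stands in for the REST message call, and the store stands in for GlideRecord, so the sketch runs outside an instance.

```javascript
// SNDataCopy-style loop (reconstruction): for each table, fetch rows
// from the source instance, then insert or update locally.
function copyTables(tables, fetchRows, store) {
  tables.forEach(function (table) {
    fetchRows(table).forEach(function (row) {
      // Keying on sys_id makes the copy idempotent: re-running a commit
      // updates existing records instead of duplicating them.
      if (store.exists(table, row.sys_id)) {
        store.update(table, row);
      } else {
        store.insert(table, row);
      }
    });
  });
}

// In-memory stand-ins for demonstration.
var store = {
  _t: {},
  exists: function (t, id) { return !!(this._t[t] && this._t[t][id]); },
  insert: function (t, row) { (this._t[t] = this._t[t] || {})[row.sys_id] = row; },
  update: function (t, row) { this._t[t][row.sys_id] = row; }
};
var fetchRows = function (table) {
  return [{ sys_id: 'abc123', name: 'Sample ' + table }];
};

copyTables(['u_lookup_data'], fetchRows, store);
copyTables(['u_lookup_data'], fetchRows, store); // second run updates, no duplicates
```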

Finally, I created a business rule on the Retrieved Update Sets table called “Automatically Run Import Data on Commit”.  It runs after a remote update set changes to “committed”.  It takes the u_import_data field and runs SNDataCopy with it.  Voila.  You now have automatic data transfer when update sets commit.  The update set for this work is here.
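The trigger condition for that business rule can be sketched as a predicate. This is my approximation of the logic, not the rule’s actual script: fire only on the transition into “committed”, and only when tables are actually listed on u_import_data.

```javascript
// Approximate condition for "Automatically Run Import Data on Commit":
// state just changed to committed, and there are tables to import.
function shouldRunImport(previousState, currentState, importTables) {
  var becameCommitted = currentState === 'committed' &&
                        previousState !== 'committed';
  return becameCommitted && importTables.length > 0;
}
```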


This is something I’ve made, but I’m not sure how much it will be used.  It was more of a “let’s see if I can do this” project.

Basically, the code behind “Retrieve Completed Update Sets”, “Preview Update Set”, “Resolve Update Set Problems”, and “Commit Update Set” is all repurposed in the SNBuild script include.  With some carefully timed business rules, an event and a script action, and a REST Message, we can post an event to another ServiceNow environment, retrieve update sets when the event runs, preview them once they are loaded, handle any preview problems automatically, and then commit the update set.

Handling the problems really comes down to personal preference, but I told my code to “skip” anything where a local update was found that was newer than what was in the update set, and to accept everything else.
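That collision policy fits in a one-line predicate. The function name and timestamp format are mine; in an instance the values would come from sys_updated_on GlideDateTime fields rather than ISO strings.

```javascript
// The post's collision policy as a predicate: skip when the local record
// is newer than what the update set carries, accept everything else.
function resolveCollision(localUpdatedOn, remoteUpdatedOn) {
  return new Date(localUpdatedOn) > new Date(remoteUpdatedOn)
    ? 'skip'
    : 'accept';
}
```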

There are three important system properties to note: sync.child.environment, sync.parent.environment, and sync.allowed.environment.  The last one restricts which environments update sets can be automatically pushed to.  For instance, many of us wouldn’t want update sets to go straight to production!
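An allow-list check against sync.allowed.environment could look like the sketch below. In an instance the property value would come from gs.getProperty(); here it’s passed in as a comma-separated string, and the hostnames are made up.

```javascript
// Guard sketch: only deploy to environments named in the
// sync.allowed.environment property (comma-separated list).
function isAllowedEnvironment(target, allowedProperty) {
  var allowed = allowedProperty.split(',').map(function (s) {
    return s.trim();
  });
  return allowed.indexOf(target) !== -1;
}
```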

A final touch I added was a “Deploy” UI Action, which pushes an update set up to its parent.

Other Experiments


I took a look at integrating with Box.  I know at Knowledge15 they showed such an integration with Geneva, but who knows when my instance will be on Geneva.  It wasn’t that hard to get an API key and programmatically post a file to Box.  I think the trick would be constantly refreshing the OAuth token and building a robust, resilient program.  I don’t have the time for that in my weekly maintenance, so that can sit on hold for now.


Automatically documenting a workflow image (with my source code extraction) would be a pretty cool thing.  I’ve got some concept ideas, but again, time is not my friend in this project.  Perhaps in the coming months I can dig into that.

Rally – SDLC

I’m also knee-deep in a Rally integration that both dumps Rally data into ServiceNow and updates Rally when you change something in ServiceNow.  I still need to come up with a better way of integrating Rally changes into ServiceNow.  Currently I know of only two ways: polling Rally at an interval, or using Rally notifications to alert ServiceNow that something has changed, after which ServiceNow can go find out what it was.
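For the polling approach, one way to keep each cycle cheap is to ask only for artifacts updated since the last successful poll. The query string below follows Rally’s WSAPI query style, but treat it as illustrative rather than a tested Rally call.

```javascript
// Incremental-poll sketch (illustrative Rally WSAPI-style query):
// only fetch artifacts changed since the last poll timestamp.
function buildPollQuery(lastPollIso) {
  return '(LastUpdateDate > "' + lastPollIso + '")';
}

var query = buildPollQuery('2015-06-01T00:00:00.000Z');
```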

Jenkins – CI/CD – Change Request

I played around with a Jenkins integration some, and it’s promising.  I’m not sure if it offers tremendous value to the organization right now, but it is pretty neat.  It gets information about all Jenkins objects (folders, jobs), individual builds, and the changesets associated with those builds (i.e., what changed in Bitbucket).  If I can combine this somehow with a SonarQube integration, I think I’ll have something good.  The other thing this integration offers is the ability to build a job remotely from ServiceNow, assuming it is set up to do so within Jenkins.  I already use this to run my test automation from ServiceNow as a UI Action.
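The remote-build part rests on Jenkins’ “Trigger builds remotely” feature: a POST to /job/&lt;name&gt;/build?token=&lt;token&gt;, where the token is configured on the job itself.  The helper below only builds that URL; the hostnames are made up, and in ServiceNow the POST itself would go through a REST message.

```javascript
// URL builder for Jenkins' remote build trigger endpoint
// (the target job must have "Trigger builds remotely" enabled).
function buildTriggerUrl(jenkinsBase, jobName, token) {
  return jenkinsBase.replace(/\/$/, '') + '/job/' +
         encodeURIComponent(jobName) + '/build?token=' +
         encodeURIComponent(token);
}

var trigger = buildTriggerUrl('https://jenkins.example.com/', 'my job', 't1');
```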

SauceLabs – Test Management

I’ve briefly played with the SauceLabs API.  The next part of my Rally Integration is to extract Test Plans/Cases into ServiceNow Test Management, which looks much better than Rally’s test management.  Once I get some data in, I plan on associating test cases with Sauce Test names, so automated test results are dumped into the system for a one-stop shop of sorts.

Reading List

At the start, I mentioned some books I had been reading:

  1. The Pragmatic Programmer
  2. The Agile Samurai
  3. Mastering JavaScript Design Patterns

All of them have been great reads, and I look forward to applying them in my work.

