Logging is good practice. It can even save your behind when things go wrong. However, sometimes what you log isn’t directly usable. If you work with TIBCO BusinessWorks and like to use render-xml to store messages, this blog might be for you.

TIBCO BusinessWorks still treats everything as XML: even JSON payloads get parsed into its XML-oriented internal model. While this offers benefits, such as being able to use XPath on them, it also has the downside of introducing the more rigid parsing model common with XML.

Another aspect of this XML heritage is that it offers more XML functions than JSON functions. The one I’m specifically pointing at is “render-xml”, an inline function that, as the name suggests, renders an object into an XML string. It is quite useful when you want to store or log an object as a string.

I use this function to log incoming payloads, even when those payloads are originally JSON. From a logging perspective that isn’t a problem, but sometimes you need those logs to retry a payload, and in some cases you need to retry a lot of payloads. While you could manually recreate the JSON payloads, that is both tedious and error-prone. So I wanted to automate it.

While there are several online XML-to-JSON converters, I didn’t want to siphon my data through some obscure website. Instead, I came across a Node.js library that handles the heavy lifting of transforming XML into JSON. It’s called xml2json and you can find it here, or install it with “npm install xml2json”. While writing this I realized I could also have used TIBCO BusinessWorks for the job, like my colleague did in this post, but at least this approach works for non-TIBCO users as well.

The Node.js library offers several options for the transformation, and one of the most useful is the option to get a JSON object as output instead of just a string. I used this to remove elements that are typical XML attributes and have no place in a JSON payload, specifically the xmlns attribute. Beware that other namespace declarations can result in JSON elements with colons in their names; use JavaScript bracket notation to get rid of those.
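To illustrate the bracket-notation point, here is a minimal sketch. The ns0:Order structure is a made-up example standing in for the parser’s output on a payload with a prefixed namespace; the element names are hypothetical, not from my actual payloads:

```javascript
// Hypothetical parser output for a payload with a prefixed namespace.
// Keys containing colons cannot be reached with dot notation (json.ns0:Order
// is a syntax error), so bracket notation is required.
var json = {
  'ns0:Order': {
    'xmlns:ns0': 'http://example.com/orders',
    'ns0:id': 42,
    'ns0:status': 'NEW'
  }
};

var root = json['ns0:Order'];  // bracket notation for the colon in the name
delete root['xmlns:ns0'];      // drop the namespace declaration

// Strip the prefix from the remaining keys to get clean JSON element names
var clean = {};
Object.keys(root).forEach(function (key) {
  clean[key.replace(/^ns0:/, '')] = root[key];
});

console.log(JSON.stringify(clean)); // {"id":42,"status":"NEW"}
```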

Since I had a lot of payloads, I went with an input and output folder, where each XML file in the Input folder results in a JSON file in the Output folder. I created the following “convertFolderToJSON.js” file for it (note the “rootElement” property, which you may want to remove or change):

var parser = require('xml2json');
const path = require('path');
const fs = require('fs');

var options = {
  object: true,
  reversible: false,
  coerce: true,
  sanitize: true,
  trim: true,
  arrayNotation: false,
  alternateTextNode: false
};

// Joining path of directory
const directoryPath = path.join(__dirname, 'Input');
// Passing directoryPath and callback function
fs.readdir(directoryPath, function (err, files) {
  // Handling error
  if (err) {
    return console.log('Unable to scan directory: ' + err);
  }

  // Listing all files using forEach
  files.forEach(function (file) {
    console.log("Reading: " + file);
    var xml = fs.readFileSync("Input/" + file, 'utf8');
    // "rootElement" is the name of the XML root; change it to match your payloads
    var json = parser.toJson(xml, options).rootElement;
    delete json.xmlns; // Remove the XML namespace declaration

    // Write each payload to its own file in the Output folder
    var outputFile = file.replace(".xml", ".json");
    fs.writeFile("Output/" + outputFile, JSON.stringify(json, null, 2), function (err) {
      if (err) {
        console.log(err);
      }
      console.log("Saved file: " + outputFile);
    });
  });
});

Run using: node convertFolderToJSON.js


While this worked fine, it did mean I still had multiple separate files I would need to POST to my API. That sounded like work, so I figured I’d automate that as well.

Luckily my API tool of choice, Postman, offers the option to call an API several times using data from a file. However, it only accepts a single file as input: either a CSV (where the first line contains the variable names) or a JSON file. In the Postman call you then reference the variable in the raw body input by putting its name between double curly brackets (in my case {{postbody}}). When you save that call in a collection, you can use the Collection Runner to run it with data from the file.
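To give an idea of the target format: a JSON data file for the Runner is an array of objects, one per iteration, where each key is a variable name. With {{postbody}} as the variable, it would look something like this (the payloads here are made up for illustration):

```json
[
  {"postbody": "{\"id\":42,\"status\":\"NEW\"}"},
  {"postbody": "{\"id\":43,\"status\":\"RETRY\"}"}
]
```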

I figured I’d start with a CSV containing a JSON payload on each line. Adjusting the code was simple enough (mainly replacing the writeFile function with the appendFile function), but Postman wasn’t having it. So I moved to JSON.

Creating the JSON payload wasn’t exactly straightforward either. Besides not being able to just write out all the lines, because I had to add opening and closing brackets and so on, Postman also parsed my payload as a JSON object and then inserted the string representation of that object (the string “[object Object]”) as the body. Very annoying.

With some trial and error it came down to writing the payload as a string with all the double quotes escaped, so that the output is proper JSON and Postman parses the value into a proper JSON string.
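In other words, the difference is whether you hand Postman an object or a pre-stringified payload. A minimal sketch with a made-up payload:

```javascript
// Hypothetical payload standing in for a parsed XML message
var payload = { id: 42, status: "NEW" };

// Wrong: postbody is an object, so Postman ends up inserting its string
// representation, "[object Object]", as the request body
var wrong = { postbody: payload };

// Right: stringify the payload first; stringifying the wrapper then escapes
// the inner double quotes, leaving a string Postman can parse
var right = { postbody: JSON.stringify(payload) };

console.log(JSON.stringify(right));
// {"postbody":"{\"id\":42,\"status\":\"NEW\"}"}
```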

Adjusting the code, I came across some asynchronous behavior in Node.js that resulted in data being written in the wrong order: mainly the opening bracket, the closing bracket, and the last entry (which doesn’t get a trailing comma) were out of order. Adding some async and await keywords, and awaiting the promise-based fs functions so there is actually something to await, remedied that situation. The result is the following “convertFolderToJSON.js” file:

var parser = require('xml2json');
const path = require('path');
const fs = require('fs');
const fsp = fs.promises;

var options = {
  object: true,
  reversible: false,
  coerce: true,
  sanitize: true,
  trim: true,
  arrayNotation: false,
  alternateTextNode: false
};

// Joining path of directory
const directoryPath = path.join(__dirname, 'Input');

fsp.readdir(directoryPath).then(async function (files) {
  // Prepping the single output file
  var outputFile = "Output/output.json";
  await fsp.writeFile(outputFile, "[\r\n");

  // A for...of loop (unlike forEach) lets us await each append in order
  for (const [index, file] of files.entries()) {
    console.log("Reading: " + file);
    var xml = fs.readFileSync("Input/" + file, 'utf8');
    // "rootElement" is the name of the XML root; change it to match your payloads
    var json = parser.toJson(xml, options).rootElement;
    delete json.xmlns; // Remove the XML namespace declaration

    // The last entry doesn't get a trailing comma
    var lineEnd = (index === files.length - 1) ? "\r\n" : ",\r\n";
    // JSON.stringify of the wrapper escapes the double quotes in the payload string
    var entry = { postbody: JSON.stringify(json) };
    await fsp.appendFile(outputFile, JSON.stringify(entry) + lineEnd);
    console.log("Added file: " + file);
  }
  await fsp.appendFile(outputFile, "]");
}).catch(function (err) {
  console.log('Unable to process directory: ' + err);
});
Run using: node convertFolderToJSON.js

This resulted in a single output file that Postman was happy to process. With it I was able to resubmit all the payloads I could find in my logs. And now you can too. Hope it helped!
