All posts by caulfiek

Sniffing XML messages stored in SQL Server with XPath.

If you have a database table that stores raw XML messages as a column value on a record (an XML payload), you can use the “.nodes()” method to inspect the XML in each record using XPath queries.

Let’s say you have a table of records that store details on interest rate swap trades….

The FpML schema for an IR Swap is here:

CROSS APPLY the results of your XPath queries to SQL placeholder aliases, which can then be used as regular SQL column values in the rest of the query:

-- IRD XCCY Swap, find me a fixed fixed IRD Swap 
WITH xmlnamespaces( '' AS fpml, 
'' AS xsi, 
'' AS kc )
SELECT trade_id, update_time, trade_event
,x.col.value('.', 'VARCHAR(100)') [productType]
,y.col.value('.', 'VARCHAR(100)') [fixRate1]
,z.col.value('.', 'VARCHAR(100)') [fixRate2]
FROM [dbo].[tradeMessages] 
CROSS apply xml_trade_payload.nodes('//fpml:swap/fpml:productType') x(col) 
CROSS apply xml_trade_payload.nodes('//fpml:swapStream[1]/fpml:calculationPeriodAmount/fpml:calculation/fpml:fixedRateSchedule/fpml:initialValue') y(col) 
CROSS apply xml_trade_payload.nodes('//fpml:swapStream[2]/fpml:calculationPeriodAmount/fpml:calculation/fpml:fixedRateSchedule/fpml:initialValue') z(col) 
where asset_class = 'IRD' 
	and product_type = 'Interest Rate Swap' 
	and update_time >= CONVERT(datetime, '2018-01-01')
	and trade_event = 'ADD'
and (x.col.value('.', 'VARCHAR(100)') != y.col.value('.', 'VARCHAR(100)')
or y.col.value('.', 'VARCHAR(100)') != z.col.value('.', 'VARCHAR(100)')
or x.col.value('.', 'VARCHAR(100)') != z.col.value('.', 'VARCHAR(100)'))
order by unique_trade_id, seq_id desc

This will find all trades added to the TradeMessages table that have two fixed-rate schedules (a fix/fix swap) and that were traded in 2018.
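If you want to sanity-check the same XPath logic outside SQL Server, here is a rough Python sketch using only the standard library. The payload below is a made-up, simplified FpML-like snippet, and the namespace URI is assumed (real messages nest the rate under calculationPeriodAmount/calculation):

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified FpML-like payload for illustration only
xml_payload = """
<trade xmlns:fpml="http://www.fpml.org/FpML-5/confirmation">
  <fpml:swap>
    <fpml:productType>InterestRateSwap</fpml:productType>
    <fpml:swapStream>
      <fpml:fixedRateSchedule><fpml:initialValue>0.025</fpml:initialValue></fpml:fixedRateSchedule>
    </fpml:swapStream>
    <fpml:swapStream>
      <fpml:fixedRateSchedule><fpml:initialValue>0.031</fpml:initialValue></fpml:fixedRateSchedule>
    </fpml:swapStream>
  </fpml:swap>
</trade>
"""

ns = {"fpml": "http://www.fpml.org/FpML-5/confirmation"}
root = ET.fromstring(xml_payload)

# Equivalent of the //fpml:swap/fpml:productType query
product = root.find(".//fpml:swap/fpml:productType", ns).text

# Equivalent of the two swapStream initialValue queries
rates = [v.text for v in root.findall(".//fpml:swapStream/fpml:fixedRateSchedule/fpml:initialValue", ns)]

# A fix/fix swap has two fixed-rate schedules; flag it when the two rates differ
is_fix_fix_mismatch = len(rates) == 2 and rates[0] != rates[1]
print(product, rates, is_fix_fix_mismatch)
```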

Nice link here to CROSS APPLY:

Mask FpML with xslt Identity Transform

This stylesheet will copy an FpML XML file, matching the fpml:tradeId node and transforming it to add the current date to the node value.

<xsl:stylesheet version="2.0"
	xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
	xmlns:fn="http://www.w3.org/2005/xpath-functions"
	xmlns:fpml="http://www.fpml.org/FpML-5/confirmation"
	xmlns:kc="urn:kc:local">
<!-- the fpml namespace should match the FpML version of your messages; kc is just a local namespace for the named template -->

<!--K Caulfield, Sep 2018
	Stylesheet to transform FpML XML messages into FpML XML messages with the trade id details prefixed to make them unique for repeated testing -->

	<xsl:output method="xml" encoding="UTF-8" indent="yes"/>
	<xsl:strip-space elements="*"/>

	<!-- define the prefix we want to use for setting up a test cycle -->
	<!--<xsl:variable name="trade_prefix" select="'20180925_01_'" />-->
	<xsl:variable name="trade_prefix" select="concat(fn:format-date(fn:current-date(),'[Y0001][M01][D01]'),'_','01','_')" />

	<!-- Identity Transform: copy every node and attribute through unchanged -->
	<xsl:template match="@* | node()">
		<xsl:copy>
			<xsl:apply-templates select="@* | node()"/>
		</xsl:copy>
	</xsl:template>

	<!-- Override for party1's tradeId text: prefix the value -->
	<xsl:template match="fpml:tradeHeader/fpml:partyTradeIdentifier/fpml:tradeId[../fpml:partyReference/@href='party1']/text()">
		<xsl:call-template name="kc:prefix">
			<xsl:with-param name="nodeVal" select="." />
		</xsl:call-template>
	</xsl:template>

	<xsl:template name="kc:prefix">
		<xsl:param name="nodeVal" />
		<xsl:value-of select="concat($trade_prefix,$nodeVal)" />
	</xsl:template>

</xsl:stylesheet>
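The [Y0001][M01][D01] date picture yields something like 20180925, so the generated prefix looks like 20180925_01_. The equivalent prefixing logic sketched in Python (the function name and sample trade id are made up for illustration):

```python
from datetime import date

def prefix_trade_id(trade_id: str, cycle: str = "01") -> str:
    # Mirrors concat(format-date(current-date(),'[Y0001][M01][D01]'),'_','01','_')
    today = date.today().strftime("%Y%m%d")
    return f"{today}_{cycle}_{trade_id}"

print(prefix_trade_id("TRADE123"))  # e.g. 20180925_01_TRADE123
```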

Script to transform with Saxon (net.sf.saxon.Transform)

Nice DOS script to transform the most recent xml file in a directory, using net.sf.saxon.Transform

REM Script to call Saxon xslt Parser for most recent file in XML directory and save results to HTML\Output.html
REM get the latest file in the XML directory
CD XML
for /f "delims=" %%i in ('dir /od /b *.xml') do (
	Set recentXMLFile=%%i
)
CD ..
Set xmlFile="XML\%recentXMLFile%"
Set xsltFile=GU_UI_NoNULLv2.xslt
Set outFile=HTML\Output.html
java -cp saxonHE9-8-0-8j\saxon9he.jar net.sf.saxon.Transform -t -s:%xmlFile% -xsl:%xsltFile% -o:%outFile% --suppressXsltNamespaceCheck:on
IF ERRORLEVEL 1 goto :bad
REM now open the html file
start "" %outFile%
goto :end

:bad
REM deal with error level
echo A Bad Thing Happened

:end

Transform a folder of XML files

If you ever need to bulk transform a directory of xml files, and output each transformed file into a new directory then Saxon has a great command line interface tool that will do just that, in one simple line!

CD C:\Users\caulfiek\Testing
java -jar c:\Users\caulfiek\XSLT\saxonHE9-8-0-8j\saxon9he.jar -s:sourceDir\ -xsl:FPML_Masking.xslt -o:targetDir
Here I have a stylesheet that performs an identity transform on an XML file, but uses XPath to find and mask certain XML node values (like customer name and transaction number).
Download the saxonHE9 CLI tool from here:
Create a “sourceDir” and a “targetDir” directory.
Fill your “sourceDir” with lots of XML files and run the command as above at your DOS command prompt.


Linn LP12 Upgrade to Hercules II Power Supply

Today I received my new Hercules II power supply for my Linn LP12 (1982, pre Valhalla).

I’ve put some pictures here so you can see how to install, which is made slightly more complicated as you’ll need to drill out the holes for the stand-offs for the PSU card.

So let’s plug the spindle reservoir, so the bearing oil doesn’t spill out. Apparently there’s a cap you can use but I simply used clingfilm taped to the top.

Plug the spindle reservoir.

Now we can place the turntable upside down on some cushions on a well-lit surface (I used my Range Master cooker as the hood light is very bright).

Upturned LP12

OK, let’s get the back off and have a look at the old PSU.

Original PSU

We can disconnect the motor wires, which are soldered on, and the wires to the switch.

Original PSU

I’ve popped out the power switch, which was relatively easy as there were no corner braces. So now we have a bare cross brace.

cross brace LP12

With a silver Sharpie pen I made marks for the holes for the spacers on the Hercules II PSU. Then I drilled 6 holes in my £600 turntable!!!

cross brace LP12 with holes and risers.

Well, after a stiff coffee I figured I was nearly there. Next, simply place the PSU on the spacers and gently push it on.

Hercules II attached to LP12

Connect it up :-).

Hercules II PSU attached to LP12.

Done! Now find something I haven’t listened to for ages.

Blue light for 45 rpm

This took me just over an hour to do and I’m very pleased with the results. I think everything has just gotten that little bit tighter; the music is more precise and the bass punchy.

This is a cheap way to upgrade an old LP12 and I can definitely recommend it.


Hello world! goodbye Joomla!

Well, after many years on Joomla 1.7, I’ve had to give in as it’s as leaky as a leaky thing, and I’ve been hacked twice now.

Why anyone would want to do this is beyond me really; it’s only a small blog with no importance and must be pretty easy to hack.

However I have learned something out of this:

  • htaccess is a nasty file and you need to keep it well protected.
  • put well difficult passwords on your ftp account and your MySQL account.
  • don’t use an “admin” user id for admin!
  • pick a CMS or blog platform that you can keep up to date with the latest version, and take regular backups.

So with that in mind I’ve plumped for the latest version of WP and it seems much, much easier to use than Joomla.

So get ready for lots of changes in Themes over the next few months until I get bored with them and probably stick to the first one that I’ve used! ha.


Recursively read a UNC path and count lines for a specific file

Here is a script that will count the number of lines in each FooBar.csv file it finds in all subdirectories.

I want to also show the parent folder for each FooBar.csv file.


# Script to count the lines in each FooBar.csv file, so determining the total open orders each day.

$outFile = "FooBarCount.csv"

If (Test-Path $outFile) {
    Del $outFile
}

echo 'Directory,LineCount' > $outFile

$List = Get-ChildItem -path "\\<servername>\d$\Archive\Main" -Recurse -Include FooBar.csv
foreach ($file in $List)
{
    $result = Get-Content $file | Measure-Object -Line | Select-Object -ExpandProperty Lines

    $directory = Split-Path $file.fullname -Parent # get the directory full name
    $parent = Split-Path $directory -Leaf # get the last directory folder name

    $printline = $parent + ',' + $result

    echo $printline
    echo $printline >> $outFile
}

Formatting datestamps on LOAD DATA

I have a csv (of Control M batch data) that I need to load into a MySQL table, but it has a selection of timestamps and dates that I need to format, so that they can be accepted by the MySQL TIMESTAMP and DATE data types.

Data Example…

20141031,20141031,MY_GROUP,MY_JOB,01/11/2014 00:30,01/11/2014 00:30,Ended OK

Here’s my code with the appropriate formatting:

LOAD DATA INFILE 'D:/Build/MySQLDump/ControlM stats for Kieran.csv'
INTO TABLE stats
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(title1,title2,report_gen,title3,title4,title5,title6,title7,title8,title9,title10,title11,title12,title13,@order_date,title15,job_name,@start_time,@end_time,job_status,title16)
SET order_date = STR_TO_DATE(@order_date, '%Y%m%d'),
    start_time = STR_TO_DATE(@start_time, '%d/%m/%Y %H:%i'),
    end_time = STR_TO_DATE(@end_time, '%d/%m/%Y %H:%i');

I’ve used the @variable placeholder so I can reference the column name in the SET command.

'%Y%m%d' : will format 20141031 correctly.

'%d/%m/%Y %H:%i' : will format 01/11/2014 00:30 correctly. Note the upper-case “H” for 24-hour format.
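For comparison, the same two parses sketched with Python’s strptime, which uses near-identical tokens (the one difference: MySQL’s %i for minutes is %M in Python):

```python
from datetime import datetime

# MySQL '%Y%m%d' is the same pattern in Python
order_date = datetime.strptime("20141031", "%Y%m%d")

# MySQL '%d/%m/%Y %H:%i' becomes '%d/%m/%Y %H:%M' in Python
start_time = datetime.strptime("01/11/2014 00:30", "%d/%m/%Y %H:%M")

print(order_date.date())  # 2014-10-31
print(start_time)         # 2014-11-01 00:30:00
```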





Remove Whitespace using Perl

If you need to remove white space from say a CSV file that has been spooled by SQLPLUS, then try this:

 perl -pi.bak -e "s/^ *//g;s/ *,/,/g;s/, */,/g" C:\working\Batch\spooled_output.csv

This will remove leading spaces on each line and any spaces either side of the commas separating values.

It will also create a *.bak file of your original.
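If Perl isn’t to hand, the same three substitutions can be sketched in Python (the helper name and sample line are made up for illustration):

```python
import re

def strip_csv_spaces(line: str) -> str:
    line = re.sub(r"^ *", "", line)   # leading spaces
    line = re.sub(r" *,", ",", line)  # spaces before commas
    line = re.sub(r", *", ",", line)  # spaces after commas
    return line

print(strip_csv_spaces("  foo ,  bar , baz"))  # foo,bar,baz
```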


ODBC TNSNAMES Dropdown Gobbledygook and Wingdings

On the Windows 7 ODBC Administrator, if you add a new Oracle connection in System DSN and the TNS Selector shows a list of gobbledygook or Wingdings, then check your %TNS_ADMIN% system environment variable.

Chances are it is not pointing to where the network\admin directory is (i.e. where your tnsnames.ora is located).

Simply update with the correct path and you’re sorted.

(right click computer, Properties->Advanced System Settings->Environment Variables->System Variables)
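To sanity-check the variable from a script rather than the GUI, here is a small Python sketch (the helper name is made up):

```python
import os

def tnsnames_found(tns_admin: str) -> bool:
    # True when tnsnames.ora exists in the directory TNS_ADMIN points at
    return bool(tns_admin) and os.path.isfile(os.path.join(tns_admin, "tnsnames.ora"))

print(tnsnames_found(os.environ.get("TNS_ADMIN", "")))
```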