## Thursday, 22 June 2017

I got an (unhelpful) error message in my server log when I changed some of my Java classes that are translated to JSON (and vice versa).

Turns out I had added a specific constructor to one of my Java classes, effectively removing the implicit default constructor that Java only adds when no other constructor is declared.

This default constructor, however, is essential to the proper working of the JSON-Java mapping.
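JSON mappers typically instantiate the target class reflectively through that no-arg constructor. A stdlib-only sketch of what goes wrong (the Player class is made up):

```java
import java.lang.reflect.Constructor;

public class DefaultConstructorDemo
{
    // Declaring a specific constructor removes the implicit no-arg
    // constructor the compiler would otherwise generate.
    static class Player
    {
        final String name;

        Player(String name)
        {
            this.name = name;
        }
    }

    public static void main(String[] args)
    {
        try
        {
            // This is essentially what a JSON mapper does internally.
            Constructor<Player> noArg = Player.class.getDeclaredConstructor();
            System.out.println("no-arg constructor found: " + noArg);
        }
        catch (NoSuchMethodException e)
        {
            System.out.println("no no-arg constructor, mapping fails: " + e);
        }
    }
}
```

Declaring an explicit no-arg constructor next to the specific one, for example Player() { this("unknown"); }, makes the reflective instantiation work again.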

## Thursday, 15 June 2017

I am using EJBs as REST Services. It works pretty well. I added security on the EJB, by means of security definitions in the web.xml file and appropriate annotations on the EJB (@DeclareRoles and @RolesAllowed).

Unfortunately, when I tried to access the methods in the EJB without being properly authorized, I received a 500 Internal Server Error. Instead I would really like to have a 401 Unauthorized.

I posted a question on StackOverflow [1], but I have found the solution [2] in the meantime, which I also posted, and will repost here.

It is possible to add an ExceptionMapper to your Application, which can map between an Exception and an appropriate HTTP Response.
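A sketch of such a mapper, assuming the container reports the authorization failure as a javax.ejb.EJBAccessException (adjust the exception type to whatever your server actually throws):

```java
import javax.ejb.EJBAccessException;
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;

// Maps the container's access exception to a 401 Unauthorized,
// instead of the default 500.
@Provider
public class EJBAccessExceptionMapper implements ExceptionMapper<EJBAccessException>
{
    @Override
    public Response toResponse(EJBAccessException exception)
    {
        return Response.status(Response.Status.UNAUTHORIZED).build();
    }
}
```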

# Note

My ApplicationConfig has now been expanded with such an ExceptionMapper.

# References

[2] RESTful Java with JAX-RS 2.0 (Second Edition) - Exception Handling
https://dennis-xlc.gitbooks.io/restful-java-with-jax-rs-2-0-2rd-edition/en/part1/chapter7/exception_handling.html
StackOverflow - 403 Forbidden vs 401 Unauthorized HTTP responses
https://stackoverflow.com/questions/3297048/403-forbidden-vs-401-unauthorized-http-responses

## Thursday, 8 June 2017

### Casting JSON Object to TypeScript Class

I have implemented an HTTP service for my Angular app using the explanation at [1]. Resource [2] mentions that it is important to pass the JSON object received from the HTTP service to the constructor of the data model.

I thought I had found a shortcut: as long as the received JSON object resembled the structure of the TypeScript class, I could just cast it to the TypeScript class.

This worked fine, until it didn't, and then I got this huge error in my face.

# The problem

The problem started appearing when I defined a method in my TypeScript class. Naturally, this method is not available in the JSON Object, and no manner of Casting is going to make it magically appear there.

You get something like:
ERROR TypeError: item.getItemPriceAsInteger is not a function
at ItemService.webpackJsonp.71.ItemService.updateItem (http://localhost.com/main.bundle.js:811:67)
at ItemSettingsComponent.webpackJsonp.183.ItemSettingsComponent.update (http://localhost.com/main.bundle.js:508:28)
at ItemSettingsComponent.webpackJsonp.183.ItemSettingsComponent.saveItem (http://localhost.com/main.bundle.js:480:14)
at Object.eval [as handleEvent] (ng:///AppModule/ItemSettingsComponent.ngfactory.js:1663:24)
at handleEvent (http://localhost.com/vendor.bundle.js:13600:138)
at callWithDebugContext (http://localhost.com/vendor.bundle.js:14892:42)
at Object.debugHandleEvent [as handleEvent] (http://localhost.com/vendor.bundle.js:14480:12)
at dispatchEvent (http://localhost.com/vendor.bundle.js:10500:21)
at http://localhost.com/vendor.bundle.js:12428:20
at SafeSubscriber.schedulerFn [as _next] (http://localhost.com/vendor.bundle.js:5549:36)
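The trap is easy to reproduce (a sketch; the Item class here is made up):

```typescript
class Item {
    name = "";
    price = 0;

    getItemPriceAsInteger(): number {
        return Math.round(this.price);
    }
}

// A plain object received over HTTP merely resembles an Item:
const json = { name: "sword", price: 100.5 };

// The cast satisfies the compiler, but it only changes the static type;
// the runtime object never gets the class's prototype, so no methods.
const item = json as Item;

console.log(typeof item.getItemPriceAsInteger); // "undefined" at runtime
```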

# Solutions

There are several solutions available as described in [3, 4, 5].

# Chosen solution

I like the one provided in [6]. It uses TypeScript Decorators [7]. It can be installed as an npm package, according to [8].

To anyone using Java, the solution provided has an uncanny resemblance to JPA annotated Entities or JAXB annotated classes.
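From memory, usage of [8] looks roughly like this (a sketch; check [6] and [8] for the exact API):

```typescript
import { deserialize, JsonProperty } from 'json-typescript-mapper';

class Item {
    @JsonProperty('name')
    name: string;

    @JsonProperty('price')
    price: number;

    getItemPriceAsInteger(): number {
        return Math.round(this.price);
    }

    constructor() {
        // the library requires initialized properties
        this.name = void 0;
        this.price = void 0;
    }
}

// deserialize builds a real Item instance, methods included:
const item = deserialize(Item, { name: 'sword', price: 100.5 });
```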

I am going to go ahead and try this one out, and see how it works.

I'll provide an update, once I get some results.

# References

[1] Angular Docs - HTTP Client
https://angular.io/docs/ts/latest/guide/server-communication.html
[2] Writing a Search Result
ng-book 2 - The Complete Book on Angular Nate Murray, Felipe Coury, Ari Lerner, Carlos Taborda
[3] StackOverflow - How do I cast a JSON object to a typescript class
https://stackoverflow.com/questions/22875636/how-do-i-cast-a-json-object-to-a-typescript-class
[4] StackOverflow - Angular2 cast a json result to an interface
https://stackoverflow.com/questions/34516332/angular2-cast-a-json-result-to-an-interface
[5] Angular2 HTTP GET - Cast response into full object
https://stackoverflow.com/questions/36014161/angular2-http-get-cast-response-into-full-object
[6] Mark Galae - TypeScript Json Mapper
http://cloudmark.github.io/Json-Mapping/
[7] TypeScript - Decorators
https://www.typescriptlang.org/docs/handbook/decorators.html
Ninja Tips 2 - Make your JSON typed with TypeScript
[8] npm - json-typescript-mapper
https://www.npmjs.com/package/json-typescript-mapper

## Thursday, 1 June 2017

### Bower

Wow. On the website for Bower [1], they mention the following quote:
“...psst! While Bower is maintained, we recommend yarn and webpack for new front-end projects!” [2, 3]
Damn, it's hard to keep up with the advancements in Front-end Land!

# References

[1] Bower - A package manager for the web
https://bower.io/
[2] Yarn - Fast, reliable, and secure dependency management.
https://yarnpkg.com/en/
[3] webpack - module bundler
https://webpack.github.io/

### flexibleJDBCRealm

I have recently changed my security realm settings, and I thought I'd document them here.

I'm still using the flexibleJDBCRealm [1] as I've documented in previous blogs [2, 3].

In the Glassfish administration console, under Configurations -> server-config -> Security -> Realms -> myRealm, the settings are now as follows.
| Name | Value | Description |
| --- | --- | --- |
| datasource.jndi | jdbc/mydb | the data source to my database |
| jaas.context | flexibleJdbcRealm | |
| sql.groups | select groupid from mmv_groups where name in (?) | using a database view makes it easier to change the table layout without affecting the security realm |

# Note

SHA-512 always produces a 64-byte hash, which encodes to 128 hexadecimal characters.

However, in the source code of the flexibleJDBCRealm, this hash is converted from a byte[] into a hexadecimal string by means of a call to "new BigInteger(1, aData).toString(16);".

This effectively means that if the byte[] starts with one or more zero nibbles, these are stripped by the BigInteger conversion, leaving you with a hash string of fewer than 128 characters.

This is why I need to use "HEX:128", instead of just "HEX".
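The stripping is easy to demonstrate with a hash that happens to start with a zero byte (a stdlib-only sketch):

```java
import java.math.BigInteger;

public class LeadingZeroDemo
{
    public static void main(String[] args)
    {
        // Pretend this is a 64-byte SHA-512 hash starting with a zero byte.
        byte[] hash = new byte[64];
        hash[1] = 0x2a;

        // The flexibleJDBCRealm conversion: leading zeros are stripped.
        String stripped = new BigInteger(1, hash).toString(16);
        System.out.println(stripped.length()); // 126, not 128

        // Left-padding to 128 characters restores the full hash.
        String padded = String.format("%0128x", new BigInteger(1, hash));
        System.out.println(padded.length()); // 128
    }
}
```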

The values are easily verifiable in the database.

I can just do a
SELECT SHA2(usertable.password, 512) from usertable where user='mrbear';

It should yield exactly the same result as the hashing function of the flexibleJDBCRealm.

# References

[1] FlexibleJDBCRealm
http://flexiblejdbcrealm.wamblee.org/site/
[2] Security Realms in Glassfish
http://randomthoughtsonjavaprogramming.blogspot.nl/2016/04/security-realms-in-glassfish.html
[3] Glassfish Security Realms
http://randomthoughtsonjavaprogramming.blogspot.nl/2014/10/glassfish-security-realms.html
[4] Installation instructions
http://flexiblejdbcrealm.wamblee.org/site/documentation/snapshot/installation.html

## Thursday, 25 May 2017

### "this" in JavaScript/TypeScript

I have been struggling with using "this" in JavaScript, ever since I got into that area of programming.

There are lots of warnings on the web for programmers who, like me, are used to a certain behaviour regarding "this" and can fall into this trap.

I recently found some really good resources that explain it.

There's one resource [1] that explains "this" in JavaScript a little.

But as I have been writing in TypeScript, I was looking for an explanation that focuses on TypeScript and helps me find the best solution to deal with this. I found that one in [2].

# For example

So I've got some code that could use a bit of a look-over.

Here's the troublesome bit.

TypeScript has an excellent Tutorial, which I've used time and again to write my things. One of the pages I've used is the explanation regarding HTTP which you can find at [3].

In it they mention a "handleError" method, which can handle HTTP errors of the PlayerService. Convenient, so I used it. It works.

Next, I wished for the handleError method in the PlayerService that takes care of HTTP connections to notify the ErrorsService. So naturally, I inject the ErrorsService into the PlayerService.

Unfortunately, in the handleError, the ErrorsService is 'undefined'. (See line 30 in the gist below)

It is explained in reference [2] why this is, but I like the following quote:
“The biggest red flag you can keep in mind is the use of a class method without immediately invoking it. Any time you see a class method being referenced without being invoked as part of that same expression, this might be incorrect.”
Now there are several solutions for this described in [2].

The solution below is what I came up with on my own, and I don't really like it, but it works.

# Local Fat Arrow

I prefer the solution called the "Local Fat Arrow", which looks like this:
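A sketch of the pattern (the service names are made up, and the callback mechanics of the HTTP library are simulated):

```typescript
class ErrorsService {
    messages: string[] = [];

    notify(message: string): void {
        this.messages.push(message);
    }
}

class PlayerService {
    constructor(private errorsService: ErrorsService) {}

    handleError(error: string): void {
        // Only works when "this" is the PlayerService instance.
        this.errorsService.notify(error);
    }

    // Broken: the method reference detaches "this", so inside
    // handleError the errorsService is undefined.
    subscribeBroken(onError: (cb: (e: string) => void) => void): void {
        onError(this.handleError);
    }

    // Local fat arrow: the arrow function captures "this" lexically.
    subscribeFixed(onError: (cb: (e: string) => void) => void): void {
        onError(e => this.handleError(e));
    }
}
```

Here onError stands in for whatever invokes the callback later without a receiver, such as an Observable error handler.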
I love it!

# References

[1] Mozilla Developer Network - javascript:this
https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Operators/this
[2] GitHub/Microsoft/TypeScript - 'this' in TypeScript
https://github.com/Microsoft/TypeScript/wiki/'this'-in-TypeScript
[3] ts GUIDE - HTTP CLIENT
https://angular.io/docs/ts/latest/guide/server-communication.html

## Thursday, 18 May 2017

A small follow-up to the From Hibernate to EclipseLink post [1].

I am not entirely satisfied with the AdditionalCriteria [4] thingy. I find it a chore to have to set a parameter on the EntityManager all the time to enable/disable it.

Biggest issue for me is that parameters set on the EntityManager are required. If they are omitted, an exception is thrown when querying.

Current solution in my software:
Turn the AdditionalCriteria on or off by means of a parameter that needs to be set on the EntityManager.

Looks like this:
Setting the parameter activePersonFilter can be done on the EntityManager as follows:
@PersistenceContext(properties =
{
@PersistenceProperty(name = "activePersonFilter", value = "0"),
@PersistenceProperty(name = "sundaydateFilter", value = "")
})
private EntityManager em;
Or
entityManager.setProperty("activePersonFilter", 0);
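For context, the annotation side of this looks roughly as follows (a sketch; the entity and its active column are made up):

```java
@Entity
@AdditionalCriteria("this.active = :activePersonFilter")
public class Person
{
    @Id
    private Long id;

    private boolean active;

    // ...
}
```

In EclipseLink's @AdditionalCriteria expression, "this" refers to the entity, and :activePersonFilter is the parameter that must be set on the EntityManager as shown above.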

# Other solutions

There are some other solutions.
1. You can remove the additionalCriteria (set it to "") in a subclass, and use the subclass specifically. See [2].
2. You can customize any mapping in EclipseLink and add the requirements/conditions that you need. See [3].
3. I could just decide to create a view on the offending database table. Then create two entities. Sounds very similar to the first option.
4. I could solve the problem in software. Just have EclipseLink not filter anything. (Which is silly, I don't wish for my ORM to get the 1000 persons in the room from the database, if there are say only three persons active.)
5. I could remove the collection entirely, and retrieve the required Persons using a NamedQuery. (Which is bogus. I like the ORM to deal with this for me, instead of having to do it myself. It's what the ORM is for.)

# Customizing a Mapping

I have recently decided to try to customize the mapping specifically in Entities that have collections containing instances of Person class. That way I have more control. See reference [3] on how this works.

It requires a @Customizer annotation.

For instance, in a Room I only wish to see the active persons.

This requires me to define the PersonsFilterForRoom as follows:
• "persons" - the name of the field that contains the collection
• "room" - the name of the field in the collection's Entity that refers back to the Room
• "id" - the name of the field in the Room entity that identifies it
It works pretty well.
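A sketch of the pieces involved, using EclipseLink's DescriptorCustomizer (field names as listed above; see [3] for the real recipe, and treat the expression details as assumptions):

```java
@Entity
@Customizer(PersonsFilterForRoom.class)
public class Room
{
    // ...
}

public class PersonsFilterForRoom implements DescriptorCustomizer
{
    @Override
    public void customize(ClassDescriptor descriptor) throws Exception
    {
        // Narrow the "persons" collection mapping so it only selects
        // active Persons belonging to this Room.
        OneToManyMapping mapping = (OneToManyMapping)
            descriptor.getMappingForAttributeName("persons");
        ExpressionBuilder person = new ExpressionBuilder();
        Expression onlyActive = person.get("active").equal(true)
            .and(person.get("room").get("id").equal(person.getParameter("id")));
        mapping.setSelectionCriteria(onlyActive);
    }
}
```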

# Note

I also noticed that this way I could have two (Lazy! That's the important bit!) Collections in the same Entity at the same time referring to the same Person. One will contain all Persons and one will contain only the Active Persons.

This is ideal, for instance for Guilds.

Like so:
This way the customizer PersonsFilterForGuild is designed to only work on the activeMembers collection.

I like it!

# References

[2] StackOverflow - Disable additional criteria only in some entity relations
[3] Mapping Selection Criteria

## Friday, 12 May 2017

A small blog this time.

At work we sometimes have serious problems with non-deterministic tests.

Martin Fowler mentioned how this can be prevented or dealt with [1].

I also noticed that these non-deterministic tests are (almost...) always in the end-to-end tests (or the functional tests, or however you wish to call them).

Martin Fowler also has something to say about those [2].

# References

[1] MartinFowler - Eradicating Non-Determinism in Tests
https://martinfowler.com/articles/nonDeterminism.html
[2] MartinFowler - TestPyramid
https://martinfowler.com/bliki/TestPyramid.html

## Thursday, 4 May 2017

### REST-assured

I am a card-carrying member of the NLJUG [0], which provides Java Magazine (not the Oracle one) six times per year.

One of the issues contained an article about REST-assured [1].

I have been using SoapUI [5] to test my REST services, and that works fine. It's a nice graphical user interface for me to fiddle with parameters and URLs and HTTP requests, and even write tests.

I am aware that it is probably possible to integrate SoapUI into my Build Pipeline, but I was really looking for something different. Something more in the line of programming, which is of course my forte. Something I could use in my unit-tests.

REST-assured was exactly what I needed and let me tell you, it's great!

# Usage

I will provide an example of how I use it.
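A sketch of such a test (URLs, parameters and payload are made up; see [4] for the real usage guide):

```java
// assumes: import static io.restassured.RestAssured.given;
String jsessionid =
    given()
        .log().ifValidationFails()
        .param("stuff", "value")
        .pathParam("id", 42)
        .contentType("application/json")
        .body("{\"name\": \"Mr. Bear\"}")
    .when()
        .put("/items/{id}")
    .then()
        .statusCode(204)
        .extract().cookie("JSESSIONID");
```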

As you can see, REST-assured is a very nice DSL (Domain Specific Language) and reads easily.

Some explanation of the above:
• log().ifValidationFails() - I wish to log the request and response if the validation/test fails, so I can find out what is wrong.
• param(name, value) - sets parameters at the end of the URL, like ?stuff=value.
• pathParam(tag, value) - replaces {tag} in your URL with the value. Convenient!
• request methods - in the example above, we are using the PUT HTTP request.
As it is used for testing, it is possible to verify the values afterwards. In the above this is visible as we expect to receive a 204 (NO_CONTENT).

We can extract the response, as is done above, to verify for example the json payload (if there is one) or get cookie values.

In the above example it is essential for the follow-up calls that we get the JSESSIONID cookie out of the response.

In subsequent REST calls, we obviously need to send that same JSESSIONID cookie along.

# Some notes

I tried to send parameters along with a POST, but param defaults to form parameters in the body, and I already had a body. Using "queryParam" instead of "param" fixes this problem.

I do enjoy using the "prettyPrint" method on a Response, to properly format a JSON body and dump it to standard output and see what I get. It's nice.

Getting some values out of your JSON formatted response body does require some serious work, though. Needs more research.

I am not entirely sure I enjoy using raw HTTP status codes like 200 or 204. I prefer something more readable, like "NO_CONTENT", but I suppose I can deal with that myself. No biggy.

Update 14/05/2017: I'm also slightly sorry to find out that rest-assured includes Hamcrest. I prefer AssertJ at the moment myself.

# Postscriptum

The article in Java Magazine also mentioned WireMock3.

Though I do not use it, it seems excellent for testing the other side of the communications, if you need to test a client that communicates with a server via rest calls.

# References

[0] NLJUG
http://www.nljug.org/
[1] REST-assured
Teije van Sloten - Java Magazine, 01-2017
[2] GitHub - Java DSL for easy testing of REST services
https://github.com/rest-assured/rest-assured
[3] WireMock
http://wiremock.org/
[4] GitHub - RestAssured Usage
https://github.com/rest-assured/rest-assured/wiki/usage
[5] SoapUI
https://www.soapui.org/
Testing REST Endpoints Using REST Assured
https://semaphoreci.com/community/tutorials/testing-rest-endpoints-using-rest-assured
RFC2616 - HTTP status codes
https://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html

## Thursday, 27 April 2017

### Cucumber @After and @Before Hooks

We're using Cucumber at work to write tests: end-to-end tests that access the user interface of the web application using Selenium.

I recently added an @After hook to a class that contained my StepDefinitions.

However, this @After hook was also called by all other scenarios [1], which was not my intention.

As a matter of fact, the @After I added was executing code similar to an @After in another StepDefinition class. I verified that both @After annotated methods were executed for each and every scenario, and they were.

So I decided to move all @After annotated methods into a "GlobalStepDefinition" class, and collapse all of them into one method.
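The result is a small class (a sketch, assuming the cucumber-jvm hook annotations):

```java
import cucumber.api.Scenario;
import cucumber.api.java.After;

// Hooks are global: an @After hook in any step-definition class runs
// after every scenario, so keep exactly one, in one place.
public class GlobalStepDefinition
{
    @After
    public void tearDown(Scenario scenario)
    {
        // the single, shared clean-up, e.g. closing the Selenium WebDriver
    }
}
```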

Incidentally, reference [3] shows why we should not have many of these end-to-end tests.

# References

[1] GitHub Issues - Before and After methods invoked for unused step definition classes #1005
https://github.com/cucumber/cucumber-jvm/issues/1005
[2] Cucumber - Polymorphic Step Definitions
https://cucumber.io/blog/2015/07/08/polymorphic-step-definitions
[3] MartinFowler.com - TestPyramid
https://martinfowler.com/bliki/TestPyramid.html

## Sunday, 23 April 2017

### Problems with Resolution and My Monitor in Fedora Core 25

Well, my monitor always has been a bit of a problem child, but it worked, so I didn't mind.

I let it bounce once on the floor, but besides some slight discolouring in the lower-right corner, it was fine.

It reports EDID settings that are completely crap, but I got used to ignoring those, using xrandr.

# XRandr settings that work for me

The following settings work:
xrandr --newmode "1920x1440" 339.50  1920 2072 2280 2640  1440 1443 1447 1514 -
xrandr --newmode "1600x1200" 235.00  1600 1728 1896 2192  1200 1203 1207 1262 -
xrandr --newmode "1280x1024"  159.50  1280 1376 1512 1744  1024 1027 1034 1078
xrandr --output VGA-0 --mode 1920x1440

# Problem

Then I upgraded to Fedora Core 25, and my monitor showed me a handsome 1024x768, which was a disappointment to say the least. (I'm used to 1920x1440.)

Using xrandr gave me the cryptic error message:
bash-4.3$ xrandr --output XWAYLAND0 --mode "1920x1440"
xrandr: Configure crtc 0 failed

After some research I noticed that Fedora Core 25 is the first one to use Wayland [1] as the default.

# Solution

Switching back to the old Xorg [2] fixed my problem.

# Checking graphics card

bash-4.3$ lspci -nnk | grep -A 3 -i vga
01:00.0 VGA compatible controller [0300]: Advanced Micro Devices, Inc. [AMD/ATI] Juniper XT [Radeon HD 5770] [1002:68b8]
Subsystem: ASUSTeK Computer Inc. Device [1043:0344]

# References

[1] Wayland Desktop
https://wayland.freedesktop.org/
[2] Fedora Project - Switching back to Xorg
https://fedoraproject.org/wiki/Changes/WaylandByDefault
Fedoraforum.org - how to install amd/ati driver on fedora 25?
AskFedora - How to add a custom resolution to Weyland Fedora 25?
ArchLinux - Forcing modes and EDID
https://wiki.archlinux.org/index.php/Kernel_mode_setting#Forcing_modes_and_EDID
Bugzilla Redhat - My Bugreport
https://bugzilla.redhat.com/show_bug.cgi?id=1443761

## Saturday, 15 April 2017

### Keyset pagination

In the past I have used the MySQL equivalent of pagination: splitting a ResultSet into pages of a fixed number of entries by means of SQL [1].

It looks like the following:
SELECT * FROM tbl LIMIT 5,10;  # Retrieve rows 6-15
For compatibility with PostgreSQL, MySQL also supports the LIMIT row_count OFFSET offset syntax, which I've used in the past.

# Performance

Performance is a key point here, as MySQL has to read and discard all rows up to the offset before it can return the requested page.

If the table is large, retrieving pages near the end of the table is going to be extremely slow.

# Solution

A better way to deal with this is to not use an offset, but instead use the key of the last row of the previous page in the query for the next page.

Obviously this only works if the resultset is sorted.
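In SQL terms (a sketch, assuming an indexed key column id and that the previous page ended at id 10010):

```sql
-- Offset pagination: the server still reads and discards 10000 rows.
SELECT * FROM tbl ORDER BY id LIMIT 10000, 10;

-- Keyset pagination: seek straight to the last-seen key via the index.
SELECT * FROM tbl WHERE id > 10010 ORDER BY id LIMIT 10;
```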

For more references that explain this a lot better, see [2] and [3].

# References

[1] MySQL 5.7 - 14.2.9. SELECT Syntax
https://dev.mysql.com/doc/refman/5.7/en/select.html
[2] Use the Index, Luke! - We need tool support for keyset pagination
http://use-the-index-luke.com/no-offset
[3] Use the Index, Luke! - Paging Through Results
http://use-the-index-luke.com/sql/partial-results/fetch-next-page

## Thursday, 6 April 2017

### Try Git

To anyone who is absolutely new to the exciting new world of Git [1]:

There seems to be a little website where you can try Git [2], working in a (very very) limited sandbox environment.

# What is Git?

If you wish to know what Git is, there are loads of interesting articles on teh interwebs that explain it very well.

But I did find the following explanation in the README provided with the source tarball:
The name "git" was given by Linus Torvalds when he wrote the very
first version. He described the tool as "the stupid content tracker"
and the name as (depending on your mood):

- random three-letter combination that is pronounceable, and not
actually used by any common UNIX command.  The fact that it is a
mispronunciation of "get" may or may not be relevant.
- stupid. contemptible and despicable. simple. Take your pick from the
dictionary of slang.
- "global information tracker": you're in a good mood, and it actually
works for you. Angels sing, and a light suddenly fills the room.
- "goddamn idiotic truckload of sh*t": when it breaks

# References

[1] Git --distributed-is-the-new-centralized
https://git-scm.com/
[2] Try Git
https://try.github.io/

## Thursday, 30 March 2017

### Setting session timeout in Glassfish

People complained that their sessions timed out too quickly in Glassfish.

I checked, and it was set to the default of 30 minutes (1800 seconds), just a tad too little.

Increased it to 2 hours (7200 seconds).

Just went to Configurations - Web Container - Session Properties - Session Timeout.

It changes the domain.xml:
<session-properties timeout-in-seconds="7200"></session-properties>

# Problem

Of course, this completely and utterly failed to work in my case.

It turns out I already had a session timeout specified in the web.xml.
<session-config>
<session-timeout>
30
</session-timeout>
</session-config>
The session timeout in the web.xml is specified in minutes.

You can also specify it in the glassfish-web.xml file [1].
<session-config>
<session-properties>
<property name="timeoutSeconds" value="600"/>
</session-properties>
</session-config>

# Precedence

You do need to check which setting takes precedence in your application. It's not clear from the documentation.

# References

[1] Glassfish 4.0 Application Deployment Guide
https://glassfish.java.net/docs/4.0/application-deployment-guide.pdf
iT Geek Help - Glassfish web container tuning settings
http://itgeekhelp.blogspot.nl/2009/03/glassfish-web-container-tuning-settings.html
StackOverflow - How to set session timeout in glassfish-web.xml configuration file?
http://stackoverflow.com/questions/33067985/how-to-set-session-timeout-in-glassfish-web-app-glassfish-web-xml-configurat

## Thursday, 23 March 2017

### AssertJ vs. Hamcrest

I recently came across a piece of code that used a Stack [1]. The Stack seems to inherit from Vector. The JavaDoc indicated (and so did my IDE, I think) that I should be using the Deque [2] interface instead. To be precise:
“A more complete and consistent set of LIFO stack operations is provided by the Deque interface and its implementations, which should be used in preference to this class.”
Deque basically seems to be a specialized Queue [3] that supports element insertion and removal at both ends [4].

In order to get to grips with Deque, I decided to write some simple tests. These are JUnit tests (version 4.12); in one I used Hamcrest [5] and in the other I went for AssertJ [6].

Let's see what happens.
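The fixture under test is plain stdlib; a sketch (with String instead of my Testdata class):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class DequeAsStackDemo
{
    public static void main(String[] args)
    {
        // A Deque used as a LIFO stack, replacing java.util.Stack.
        Deque<String> transmittedTestdata = new ArrayDeque<>();
        transmittedTestdata.push("testdata1");
        transmittedTestdata.push("testdata2");

        System.out.println(transmittedTestdata.pop());  // testdata2
        System.out.println(transmittedTestdata.peek()); // testdata1
    }
}
```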

# A simple compare

Hamcrest:
assertThat(actual, equalTo(testdata2));
AssertJ:
assertThat(actual).isEqualTo(testdata2);

# Collections

Hamcrest:
assertThat(transmittedTestdata, hasSize(2));
AssertJ:
assertThat(transmittedTestdata).size().isEqualTo(2);

# Null Values

Hamcrest:
assertThat(actual, not(nullValue()));
AssertJ:
assertThat(actual).isNotNull();

# Exceptions

Hamcrest:
@Test(expected = NoSuchElementException.class)
public void testEmptyDequeueException()
{
Testdata pop = transmittedTestdata.pop();
}
AssertJ:
assertThatThrownBy(transmittedTestdata::pop).isInstanceOf(NoSuchElementException.class);

# Imports

A comparison between the required imports of Hamcrest and AssertJ is interesting:
Hamcrest:
import java.util.Deque;
import java.util.NoSuchElementException;
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.not;
import static org.hamcrest.CoreMatchers.nullValue;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.empty;
import static org.hamcrest.Matchers.hasSize;
import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;
AssertJ:
import java.util.Deque;
import java.util.NoSuchElementException;
import static org.assertj.core.api.Assertions.assertThat;
import static org.assertj.core.api.Assertions.assertThatThrownBy;
import org.junit.After;
import org.junit.AfterClass;
import org.junit.Before;
import org.junit.BeforeClass;
import org.junit.Test;

# Notes

• I really like the AssertJ fluent API. It feels more natural to me than the Hamcrest one.
• It is way easier to find the appropriate matchers in AssertJ. I get the full benefit of my IDE code completion.
• Adding the appropriate import is way easier. Using Hamcrest, I always get a choice of five different imports for the same matcher.
• I need fewer imports anyway.
So far, I like AssertJ a lot.

I need to work with AssertJ a lot more, to see some of the interesting stuff.

# References

[1] Java 7 JavaDoc - Stack
https://docs.oracle.com/javase/7/docs/api/java/util/Stack.html
[2] Java 7 JavaDoc - Deque
https://docs.oracle.com/javase/7/docs/api/java/util/Deque.html
[3] Java 7 JavaDoc - Queue
https://docs.oracle.com/javase/7/docs/api/java/util/Queue.html
[4] Wikipedia - Double-ended queue
https://en.wikipedia.org/wiki/Double-ended_queue
[5] Hamcrest - Matchers that can be combined to create flexible expressions of intent
http://hamcrest.org/
[6] AssertJ - Quick start
http://joel-costigliola.github.io/assertj/assertj-core-quick-start.html

## Thursday, 16 March 2017

### reveal.js

Our architect recently put together a presentation regarding our new framework using reveal.js [1].

I had never heard of reveal.js and I was intrigued. It seems to be a presentation framework that runs in your web browser, using npm [2] and grunt [3] and JavaScript and MarkDown [4] and all that.

I figured I'd give it a try for my next presentation.

I downloaded a release [6] and used the very clear instructions on GitHub [5] on how it works.

Installing a new release seems to be nothing more than:
- unzip
- edit index.html
- browse to index.html

Luckily, I had the changes our architect made to bring it into line with the company layout guidelines. It was nothing more than a different css file that is based on the "white"-theme (which is also a css file). The default theme when you get a release is the "black"-theme, similar to the one visible at [1].

You can decide to just browse to the file index.html locally to display the presentation, but if you do a "grunt serve" a small webserver is started that serves the webpage and related resources. The latter option provides more functionality.

# MarkDown

<div class="reveal">
<div class="slides">
<section data-markdown="slides.md"
data-separator="^\n\n\n"
data-separator-vertical="^\n\n"
data-separator-notes="^Note:"
data-charset="utf-8">

</section>
</div>
</div>
As you can see above, you can specify how the sheets are divided: which character sequence marks a new horizontal slide, a new vertical slide, or the start of the speaker notes.

Initializing the presentation is done using:
// More info https://github.com/hakimel/reveal.js#configuration
Reveal.initialize({
width : 1280,
height : 1024,
slideNumber: 'c/t',
showNotes: true,
history: true,
// More info https://github.com/hakimel/reveal.js#dependencies
dependencies: [
{ src: 'plugin/markdown/marked.js' },
{ src: 'plugin/markdown/markdown.js' },
{ src: 'plugin/notes/notes.js', async: true },
{ src: 'plugin/zoom-js/zoom.js', async: true }
]
});
I set "showNotes" to true, because I wished to print out the sheets including the notes. See Printing below.

# CSS

It is easy to add custom CSS to individual slides. For example:
<!-- .slide: data-background="#ffffff" data-background-image="images/background_subtitle.png"  data-background-size="auto 100%"  data-background-repeat="no-repeat" -->
A common one used to change the font of the previous element (useful for source code):
<!-- .element: class="small" -->

# Printing

Printing your sheets seems to be as simple as surfing to the url:
http://localhost:8080/?print-pdf#/
In the Chrome browser this generates a print-friendly version that you can simply print to a PDF file from the browser.

# Conclusions

It is very nice if you are a programmer or web guy and you do not wish to fire up Microsoft PowerPoint.

An advantage is of course that MarkDown files can easily be added to your version control system.

Another advantage is that you can refer to images on the Internet/Intranet. I managed to do just that, by referring to images already on our Intranet Confluence pages. At least the images will always be up to date.

(p.s. It also means that in order to view my presentation properly, one has to be logged into Confluence. I found that out rather quickly, when trying my presentation out in one of our conference rooms.)

I don't really like the markdown separator settings displayed above, as it is too easy to add or remove one blank line too many.

I also had a problem where I must have made a syntax mistake somewhere, and in my Firefox browser the presentation managed to hang; after several seconds I'd get a "Script is running too long. Do something about it?" message.

There are several keyboard shortcuts for navigation through the sheets during the presentation, which is nice, as the mouse isn't all that handy.

I don't much like the "sheet notes", which are displayed in a separate browser window. I usually have them turned off.

# References

[1] Reveal.js - the HTML Presentation Framework
http://lab.hakim.se/reveal-js/
[2] NPM
https://www.npmjs.com/
[3] Grunt
https://gruntjs.com/
[4] Wikipedia - MarkDown
https://en.wikipedia.org/wiki/Markdown
[5] GitHub - reveal.js
https://github.com/hakimel/reveal.js
[6] reveal.js releases
https://github.com/hakimel/reveal.js/releases
GitHub- Basic Writing and Formatting Syntax
https://help.github.com/articles/basic-writing-and-formatting-syntax/

## Thursday, 9 March 2017

### Git Stash

This little blog post is just for me to remember my favorite "git stash" commands. It took me a little while to actually use the stash, but that is because IntelliJ provides a similar functionality called "shelving", which I had used all this time.

I use branches a lot when using Git, and the problem there is that Git usually complains if I wish to change branches, while I still have uncommitted changes in my current branch. Therefore the "stash" command is for me very valuable.

Stash your uncommitted changes:
$ git stash

Get your uncommitted changes back from the stash:
$ git stash apply

Get a list of your current stashes:
Get a list of your current stashes:
$ git stash list
stash@{0}: WIP on master: 049d078 added the index file
stash@{1}: WIP on master: c264051 Revert "added file_size"
stash@{2}: WIP on master: 21d80a5 added number to log

Remove a no longer needed stash:
$ git stash drop stash@{0}
Dropped stash@{0} (364e91f3f268f0900bc3ee613f9f733e82aaed43)
One command I particularly like is this one, which does an apply of your stash and, once done, automatically removes it from the list of stashes:
$ git stash pop

The stash has a lot of similarities to your standard Stack implementation (or Deque, depending on your point of view).

I notice that if I do not clean up or use the "pop" subcommand, my list of stashes tends to grow quite long without my noticing.

# References

6.3 Git Tools - Stashing
https://git-scm.com/book/en/v1/Git-Tools-Stashing
Atlassian Tutorials - Git Stash
https://www.atlassian.com/git/tutorials/git-stash
Ariejan De Vroom - GIT: Using the stash
https://ariejan.net/2008/04/23/git-using-the-stash/
Git Stash - Man Page
https://git-scm.com/docs/git-stash

## Thursday, 2 March 2017

### Maven and the Dangers of Snapshots

Recently we've been causing problems in the regular builds of branches of our software.

Basically the problem is our own fault and is related to Maven Snapshots.

According to the guide1, a Snapshot is a library that is still under development, and may change rapidly as new versions of the Snapshot are pushed to the Nexus regularly.

If a dependency on a Snapshot is defined in your pom.xml, then Maven, as it should, always picks the latest Snapshot.

This is fine and dandy if you are currently developing your software, and you want the newest of the new of the libraries that your other software teams are developing.

# The Problem

It means that once you create a stable release of your software (and the appropriate Git branch for it to live in as well, of course) it is important to replace the Snapshot in the pom.xml with the appropriate released version.

We neglected to do just that.

# The Consequence

Our branch containing the release version of our software suddenly bombed with compile errors in the Deployment Pipeline.

This caused the maintenance people a headache, as the Git revision of the branch had not changed between the previous build (which compiled just fine) and the new build (which bombed).

Despite the build being pulled from Git with the exact same revision, it was technically different from the previous build.

All because we kept developing the Snapshot and pushing it into the Nexus.

# What we should have done

• create a proper release of the library
• change the pom.xml in the branch to refer to this release.
• create a new snapshot of the library
• use the new snapshot in the pom.xml of the master branch (which is used for development)
Now the build of both the branch as well as the master should compile again.
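For illustration, the release-branch change would look something like this; the coordinates com.example:mylib and the version numbers are made up:

```xml
<!-- master branch: keep tracking ongoing development -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>mylib</artifactId>
    <version>1.3-SNAPSHOT</version>
</dependency>

<!-- release branch: pin the properly released version -->
<dependency>
    <groupId>com.example</groupId>
    <artifactId>mylib</artifactId>
    <version>1.2</version>
</dependency>
```

With the release branch pinned to 1.2, later pushes of 1.3-SNAPSHOT to the Nexus can no longer break its builds.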

# References

[1] Apache Maven - Getting Started
https://maven.apache.org/guides/getting-started/
Continuous Releasing of Maven Artifacts
https://dzone.com/articles/continuous-releasing-maven

## Thursday, 23 February 2017

### A Natural Progression Towards Lambda

At my work, in order to deal with a grid1 in the frontend and a list at the backend, we use a DataModel at the backend.

It seems simple enough, and used to work as follows:
private List<Person> list = Arrays.asList(new Person("Jim"), new Person("Jack"));

private ListDataModel<Person> dataModel = new ListDataModel<>(list);
This had some shortcomings when for example the user decided to select a different department, executing this code:
list = findPersonsByDepartment(department);
This seems to work just fine. A person selects a different department, and the employees data model updates itself. Or so one would think.
What happens is that the ListDataModel retains the old list. So, the frontend is never updated.
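The underlying cause is Java's reference semantics: the ListDataModel keeps the reference to the list it was constructed with, so reassigning a local variable has no effect on it. A minimal sketch, with a hypothetical Holder class standing in for the ListDataModel:

```java
import java.util.Arrays;
import java.util.List;

public class StaleReferenceDemo {
    // stands in for ListDataModel: it captures the list reference at construction
    static class Holder {
        private final List<String> wrapped;
        Holder(List<String> list) { this.wrapped = list; }
        List<String> getList() { return wrapped; }
    }

    public static void main(String[] args) {
        List<String> list = Arrays.asList("Jim", "Jack");
        Holder dataModel = new Holder(list);

        // reassigning the local variable does NOT change what the holder sees
        list = Arrays.asList("Jill");

        System.out.println(dataModel.getList()); // still [Jim, Jack]
    }
}
```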

# Reusing the same list

Because of this little problem, our code contains a lot of the following statements, to make sure the same list is used over and over again:
list.clear();
It seems a slightly convoluted way of doing things.

# Anonymous inner classes

We soon found out that anonymous inner classes would solve this problem better, and in fact there are more anonymous inner classes than there are named DataModels in our current code base.

It looks like the following:
private ListDataModel<Person> dataModel = new ListDataModel<Person>()
{
@Override
public List<Person> getList()
{
return findPersonsByDepartment(department);
}
};
There now, any time the contents of the ListDataModel are requested in the frontend, a new and accurate List containing the department employees is returned.

# Passing code

Instead of creating an entire new anonymous inner subclass of a ListDataModel, it might be more elegant to create an interface especially for this purpose, call it the ListProvider interface.

As follows:
public interface ListProvider<T>
{
List<T> getList();
}

private ListDataModel<Person> dataModel = new ListDataModel<Person>(new ListProvider<Person>()
{
@Override
public List<Person> getList()
{
return findPersonsByDepartment(department);
}
});

# Using lambdas

The good part is that now with Java 8 we can start using Lambdas.

And in this case, we have an interface containing just one abstract method. This is in essence the definition of a functional interface, which is exactly what a lambda can implement.

So now the proper way to write this would be the following:
public interface ListProvider<T>
{
List<T> getList();
}

private ListDataModel<Person> dataModel =
new ListDataModel<Person>(() -> findPersonsByDepartment(department));
Convenient, isn't it?
In this case, the lambda is called a Supplier2.
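The same late-evaluation behaviour can be shown with java.util.function.Supplier on its own; the findByDepartment method below is a made-up stand-in for findPersonsByDepartment:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Supplier;

public class SupplierDemo {
    // hypothetical stand-in for findPersonsByDepartment
    static List<String> findByDepartment(String dept) {
        return Arrays.asList(dept + "-Jim", dept + "-Jack");
    }

    public static void main(String[] args) {
        // the lambda body runs every time get() is called,
        // so the caller always sees fresh data
        String[] department = {"Sales"};
        Supplier<List<String>> provider = () -> findByDepartment(department[0]);

        System.out.println(provider.get()); // [Sales-Jim, Sales-Jack]
        department[0] = "IT";
        System.out.println(provider.get()); // [IT-Jim, IT-Jack]
    }
}
```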

# References

[1] Welcome to the SlickGrid! (outdated sadly)
https://github.com/mleibman/SlickGrid/wiki
[2] Supplier (Java Platform SE 8)
https://docs.oracle.com/javase/8/docs/api/java/util/function/Supplier.html

## Thursday, 16 February 2017

### Angular - Semantic Versioning

Angular1 has switched to Semantic Versioning2.

So, the brand new thing that is totally hot right now is Angular 4.0.

The versions released, and to be released are available here3.

Contrary to the image in the blog, the word for referring to all this is "Angular".

Looks like version 5 of Angular will be released later in the year.

I hope I can keep up.

# References

[1] Ok... let me explain: it's going to be Angular 4.0, or just Angular
http://angularjs.blogspot.nl/2016/12/ok-let-me-explain-its-going-to-be.html
[2] Semantic Versioning 2.0.0
http://semver.org/
[3] Versioning and Releasing Angular
http://angularjs.blogspot.nl/2016/10/versioning-and-releasing-angular.html#Timebased_release_cycles_18

## Thursday, 9 February 2017

### Group by problem with Hibernate

Recently I had a small problem where the GROUP BY function didn't work once I added a subtable to the query. The GROUP BY expression no longer matched.

It seems to be a long-standing problem with Hibernate.

# References

Java Persistence with Hibernate, page 392
Christian Bauer, Gavin King, Gary Gregory
HHH-1615 - GROUP BY entity does not work
https://hibernate.atlassian.net/browse/HHH-1615
HHH-2436 - Allow grouping by entity reference (per JPA spec)
https://hibernate.atlassian.net/browse/HHH-2436

## Wednesday, 1 February 2017

### Extending SSL Certificate in Glassfish

This is a followup of the blog post SSL Certificates in Glassfish.

The reason for this followup is that signing of websites and code seems to be a very error-prone and manual process, done infrequently enough for all of us to forget afterwards.

It basically follows the same path as the previous blog post, but I find it convenient to write stuff down, in case I forget.

Now my certificate on my website had expired, and it took me a while, before I found the time and the motivation to extend the certificate.

I'm still with GoDaddy.com4. Thankfully, the CSR was already transmitted last year, and I can just reuse that one.

Once I submit the CSR, I am required to verify that I am the owner of the Domain. This time, thank goodness, it requires nothing more than the clicking of a link sent to the email address that is stored in the WHOIS information.

Nothing like putting a file in the root directory of the webserver or some such, like the first time.

Once that is done, I need to download the new certificates from godaddy.com. They ask for the type of web server that they need to generate the certificates for. Glassfish is not mentioned anywhere, so I select "Other".

The zip file I then receive, contains the same files as mentioned in my previous blogpost1.

As I already installed all the root certificates, I choose to ignore the gd_bundle-g2-g1.crt file.

The more interesting file is the 2375839yrghfs5e7f.crt file.

# Replace the original self-signed certificate with the certificate you obtained from the CA

[glassfish@server config]$ keytool -import -v -trustcacerts -alias s1as -file /home/glassfish/junk/2375839yrghfs5e7f.crt -keystore keystore.jks -storepass changeit
Certificate reply was installed in keystore
[Storing keystore.jks]

# Verifying the keystore.jks

You can verify that all is well, by using the above command to check the keystore. You will see something like the following:
Alias name: s1as
Creation date: Feb 1, 2017
Entry type: PrivateKeyEntry
Certificate chain length: 4
Certificate[1]:
Owner: CN=www.server.org, OU=Domain Control Validated
Issuer: CN=Go Daddy Secure Certificate Authority - G2, OU=http://certs.godaddy.com/repository/, O="GoDaddy.com, Inc.", L=Scottsdale, ST=Arizona, C=US
Serial number: 8446c5db57d376ed
Valid from: Wed Feb 01 14:27:00 CET 2017 until: Thu Feb 01 14:27:00 CET 2018
Certificate fingerprints:
MD5: 75:7a:73:67:72:6a:6b:73:65:72:6e:79:20:62:61:77
SHA1: 75:7a:73:67:72:6a:6b:73:65:72:6e:79:20:62:61:77:79:20:72:67
SHA256: 75:7a:73:67:72:6a:6b:73:65:72:6e:79:20:62:61:77:79:20:72:67:68:20:61:77:65:72:3c:6f:3b:20:59:38
Signature algorithm name: SHA256withRSA
Version: 3
Which shows that, as of today, the keystore has a certificate that is valid for exactly one year.

To apply your changes, restart GlassFish Server, according to chapter "To Sign a Certificate by Using keytool2".
# Verifying after reboot

Earlier, when issuing the openssl command:
openssl s_client -connect www.server.org:4848
The result was:
SSL handshake has read 15360 bytes and written 339 bytes
---
New, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES256-GCM-SHA384
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : ECDHE-RSA-AES256-GCM-SHA384
    Session-ID: 5891E20F7C4FA7CBFA6ABF7E0EC6EC2D40C2CB4A148EFCEAE7F3179F5F80763F
    Session-ID-ctx:
    Master-Key: B8C7BA7AC15244DC581749AC9702609F8EB1BCE03F5B0CD53ECEE382D93877EBF6D5E3FE9F603D6D8253521A29EEB494
    Key-Arg   : None
    Krb5 Principal: None
    PSK identity: None
    PSK identity hint: None
    Start Time: 1485956532
    Timeout   : 300 (sec)
    Verify return code: 10 (certificate has expired)
---
Notice especially that last bit.

Once the glassfish was rebooted, the same command yields:
SSL handshake has read 15370 bytes and written 339 bytes
---
New, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES256-GCM-SHA384
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : ECDHE-RSA-AES256-GCM-SHA384
    Session-ID: 5891E99B097CCC082475F5949A55ABD71C7AED902725AA6E98E77EAA3FC7BF01
    Session-ID-ctx:
    Master-Key: 9465D76CDC8D4CA19E46B2367ECD35382BA8049707BBF1D4D06E0389E85F724BA646F3C2C9FD45CF256C12ED9A0714F0
    Key-Arg   : None
    Krb5 Principal: None
    PSK identity: None
    PSK identity hint: None
    Start Time: 1485958464
    Timeout   : 300 (sec)
    Verify return code: 0 (ok)
---
Again, I would like to draw your attention to the last line.

And that's it for now!
# References

[1] SSL Certificates in Glassfish
http://randomthoughtsonjavaprogramming.blogspot.nl/2015/10/ssl-certificates-in-glassfish.html
[2] GlassFish Server Open Source Edition Security Guide Release 4.0
https://glassfish.java.net/docs/4.0/security-guide.pdf
[3] GlassFish Server Open Source Edition Administration Guide Release 4.0
https://glassfish.java.net/docs/4.0/administration-guide.pdf
[4] GoDaddy: Hosting, domain registration, websites and more...
http://www.godaddy.com
SSLShopper - most common java keytool keystore commands
https://www.sslshopper.com/article-most-common-java-keytool-keystore-commands.html
SSLShopper - SSL Certificate Verification
https://www.sslshopper.com/ssl-checker.html

## Saturday, 28 January 2017

### Race Condition in JavaScript Promises

I recently encountered a race condition in the use of my promises. It seems trivial, but these little mistakes have a tendency to cause a lot of debugging time.

getUrls(): ng.IPromise<any> {
    if (this.urls != null) {
        var urls = this.urls;
        return this.$q(function (resolve, reject) {
            resolve(urls);
        });
    }
    this.urls = {};
    var urls: any = this.urls;
    return this.$http.get("conf/urls.json").then(function (response: any) {
        urls.productionUrl = response.data.productionUrl;
        urls.testingUrl = response.data.testingUrl;
        return response.data;
    });
}

Can you spot the problem?

The problem occurs when two threads access the same method at the same time. Click here for the answer.

I always have problems wrapping my mind around JavaScript promises.

## Thursday, 19 January 2017

### Rewriting History with Git

Well, this is basically a followup of my previous blogpost about git.

A note of warning: rewriting history can be tricky, and you should perform this only on your local Git repository, on things you haven't yet pushed to a remote (public) repository.

For more detailed information on how you can work with it, see the References.

Right here, right now, I'm going to provide the way I've used it in my current work.

# Logs

Rewriting history is done by "rebasing" your current checkins. It helps if you can easily retrieve your current checkins from the logs.

The following shows my last 5 checkins in my local branch.
[mrbear@localhost project]$ git log --pretty=format:"%h %s" HEAD~5..HEAD
75d7620 BUGS-0010 Make it visible whether user is using the test or the production version.
ca9217e BUGS-0010 Cache test/prod urls.
b1f93c1 BUGS-0010 Replace hardcoded urls with configuration.
46908ce BUGS-0010 Implement configuration using Gulp.
49115c3 BUGS-0010 Two ways: "gulp test" or "gulp production".
Bear in mind that the log shows the checkins from most recent (on top) to the least recent (last). Rebasing takes the checkins in the opposite order.

# Squashing checkins

Let us try to squash some commits that we've made together into one single commit.
[mrbear@localhost project]\$ git rebase -i HEAD~5
49115c3 BUGS-0010 Two ways: "gulp test" or "gulp production".
46908ce BUGS-0010 Implement configuration using Gulp.
b1f93c1 BUGS-0010 Replace hardcoded urls with configuration.
ca9217e BUGS-0010 Cache test/prod urls.
75d7620 BUGS-0010 Make it visible whether user is using the test or the production version.

# Rebase d5defcb..75d7620 onto d5defcb (5 command(s))
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out
I've decided to squash 46908ce, so that this commit will be combined with its previous commit, the 49115c3.

I can even change the commit message and combine the two commit messages!
# This is a combination of 2 commits.
# The first commit's message is:

BUGS-0010 Implement configuration using Gulp.

# This is the 2nd commit message:

BUGS-0010 Two ways: "gulp test" or "gulp production".

# Please enter the commit message for your changes. Lines starting
# with '#' will be ignored, and an empty message aborts the commit.
#
# Date:      Thu Jan 19 14:13:50 2017 +0100
#
# rebase in progress; onto 0a98e03
# You are currently editing a commit while rebasing branch 'BUGS-0010' on '0a98e03'.
#
# Changes to be committed:
#       modified:   config.xml
#       modified:   lang.json
# Untracked files:
#       project/config.xml~
#       project/res/
#       project/www/conf/
#       project/www/i18n/
#
The output will look something like this:
[detached HEAD 5b2f880] BUGS-0010 Configuration using gulp in two ways: "gulp test" or "gulp production".
Date: Mon Jan 9 13:20:45 2017 +0100
8 files changed, 36 insertions(+), 6 deletions(-)
rename project/{ => conf}/config.xml (94%)
rename project/{www/i18n => conf}/lang.json (99%)

# Reordering checkins

Reordering the checkins is as simple as cutting and pasting the appropriate lines into a different sequence.

# Removing checkins

Removing a checkin can be done by just removing a line from the sequence. Use with caution.

# References

Atlassian Git Tutorial - Rewriting history
https://www.atlassian.com/git/tutorials/rewriting-history/git-rebase-i
Git - 7.6 Git Tools - Rewriting History
https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History

## Thursday, 12 January 2017

### Base 64 Encoding and URLs

I recently had some issues with base64 encoding of images and documents, prior to sending them over HTTP to the frontend of my app, and there decoding it again.

The issues were manifold.

Let me try and indicate the problems I encountered.

In the backend I use the BaseEncoding class provided in the Core of Google Guava (com.google.common.io.BaseEncoding). The method "base64()" speaks for itself.

At the frontend, using JavaScript, I used the "atob()" method1 to get the whole base64 encoded string back into its original shape.

# Problem 1 - String contains an invalid character

The "atob()" method threw an error when attempting to decode the string. After some research, comparing the string sent by the backend with the string received by the frontend, I noticed two differences.

Apparently, the backend is sending a url-safe encoded string2, despite me not having specified that this is what I want.

Well, the differences aren't major and a simple solution does the trick:
atob(contents.replace(/-/g, "+").replace(/_/g, "/"));
And voila, atob() no longer complains about invalid characters.
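The difference between the two alphabets can be demonstrated with the JDK's own java.util.Base64 (used here instead of Guava, to keep the sketch dependency-free): the standard alphabet uses '+' and '/', while the url-safe alphabet of RFC 4648 section 5 uses '-' and '_'.

```java
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        // three bytes chosen so the output hits the last two alphabet characters
        byte[] data = {(byte) 0xfb, (byte) 0xef, (byte) 0xff};

        // standard alphabet (RFC 4648 section 4) uses '+' and '/'
        String standard = Base64.getEncoder().encodeToString(data);   // "++//"
        // url-safe alphabet (RFC 4648 section 5) uses '-' and '_'
        String urlSafe = Base64.getUrlEncoder().encodeToString(data); // "--__"

        // the replace trick from above maps one alphabet onto the other
        String repaired = urlSafe.replace('-', '+').replace('_', '/');
        System.out.println(repaired.equals(standard)); // true
    }
}
```

In Guava terms, the behaviour I observed from the backend corresponds to BaseEncoding.base64Url() rather than base64().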

# Problem 2 - Decoded document does not match original document

My app stored the received document, after decoding, into a file locally, so it can be opened. My App is created using Cordova (and Ionic and some other stuff) and I use the Cordova File Plugin3 to write the file.

The PDF document that I was using as a test, seemed to be transferred just fine, but upon opening it on the Tablet, an empty PDF document was shown.

There were vast differences between the original document and the decoded document. The differences seem to focus on the decidedly "weird" characters. The alphabet seemed just fine.

Being at a loss for the moment, I decided to use a library named js-base644. That didn't help in the slightest. Using runkit5 to test it, I found that it actually decodes base64 encoding into UTF-8 badly.

There were a number of bugs reported with it in GitHub.

# Problem 3 - Cordova File Plugin

After switching back to the method "atob()", and comparing the output of "atob()" with the original document, I found them to be identical.

However, the file stored on the Tablet was still suffering from the exact same symptoms. Clearly something was going wrong with the Cordova File Plugin.

After looking at the documentation3, I found that the File Plugin will also output UTF-8, similar to the js-base64 library.

In the end, I found out that it only outputs UTF-8, if I write a string in a Blob to the File. If I change what I write into a JavaScript ArrayBuffer in a Blob, things work as they should.

And I finally got a nice PDF in the standard PDF Viewer of the Tablet.

# References

[1] MDN - Base64 encoding and decoding
https://developer.mozilla.org/en/docs/Web/API/WindowBase64/Base64_encoding_and_decoding
[2] RFC4648 The Base16, Base32, and Base64 Data Encodings - Section 5
https://tools.ietf.org/html/rfc4648#section-5
[3] File - Apache Cordova - cordova-plugin-file
https://cordova.apache.org/docs/en/latest/reference/cordova-plugin-file/
[4] js-base64 - Yet another Base64 transcoder in pure JS
https://www.npmjs.com/package/js-base64
[5] Runkit
https://runkit.com/npm/js-base64
StackOverflow - Using Javascript's atob to decode base64 doesn't properly decode utf-8 strings
http://stackoverflow.com/questions/30106476/using-javascripts-atob-to-decode-base64-doesnt-properly-decode-utf-8-strings
C#411 - Convert Binary to Base64 String
http://www.csharp411.com/convert-binary-to-base64-string/

## Thursday, 5 January 2017

### Inkscape

Inkscape is a vector graphics drawing tool for Linux, which I've used in the past.

Fedora magazine has a series of articles on it, which I link to here.
Part 1 - Getting started with Inkscape on Fedora
https://fedoramagazine.org/getting-started-inkscape-fedora/
Part 2 - Inkscape: Adding some colour