Ionut Indre

Application Logging Is a Double-Edged Sword

If you have spent some time in the software development industry, you already know that good logging can save you from a lot of nasty situations and gives you insight into problems and errors that may have occurred. Basically, it makes the entire process of debugging applications easier, since you already know where and how to look for the issues.

Most of the time, good logging significantly shortens the response time for fixing faulty deploys, or simply gives the developer useful information on how to fix existing issues.

So, a couple of years ago, we all started adding file logging to our applications, but as soon as we began scaling our on-premise servers to more instances, we realized that tracing and analyzing log files can take a lot of time and is not a healthy process for anyone.

The next logical step, for all of us, was to find a way to aggregate all that information and a better way to analyze and look at the data.

Thanks to the ELK Stack (Elasticsearch, Logstash and Kibana), we found an easy and elegant way to fix this impediment, regardless of where the servers run (cloud or on-premise) or the number of instances.

Now everything seems blissful, but sometimes we discover later on that our applications are not as smart as we thought in the beginning. Manually searching and analyzing log files on multiple servers invites human error, and some issues or logged data can easily be overlooked.

Our applications usually follow a continuous, iterative process: we develop new features, extend existing ones, and also add extra logging. As I emphasized before, monitoring and analyzing logged data on multiple servers is not the simplest task, and since we are in the development phase, we tend to add logging everywhere, just to make sure that maintenance and debugging will be easier later on.

This tendency, combined with the old way of logging to files, led to a very strange situation that I encountered: the personal data and credentials of a significant number of users were written to the log files. This wonderful discovery only became easy to see after all the file logging was migrated to the modern ELK Stack approach, and some ugly details were revealed.

What I really want to point out is that proper logging is a crucial and mandatory process for every application, but we must pay attention to what we are logging and where. Otherwise, we may end up logging data that we must not, which could later violate GDPR compliance or, even worse, expose users in the event of a data breach.
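One way to enforce this at the application level is to sanitize records before they ever reach a file or a log shipper. Below is a minimal sketch using Python's standard logging module; the regular expressions and the "app" logger name are illustrative assumptions, not a complete catalogue of sensitive data.

```python
import logging
import re

# Illustrative patterns for values we never want to see in logs:
# password/authorization fields and e-mail addresses.
SENSITIVE_PATTERNS = [
    re.compile(r"(password\s*[=:]\s*)\S+", re.IGNORECASE),
    re.compile(r"(authorization\s*[=:]\s*)\S+", re.IGNORECASE),
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # e-mail addresses
]

class RedactingFilter(logging.Filter):
    """Masks sensitive values before a record reaches any handler."""

    def filter(self, record: logging.LogRecord) -> bool:
        message = record.getMessage()  # merge args into the message
        for pattern in SENSITIVE_PATTERNS:
            message = pattern.sub(
                # keep the field name (group 1) if the pattern has one
                lambda m: (m.group(1) if m.lastindex else "") + "[REDACTED]",
                message,
            )
        record.msg = message
        record.args = ()  # args are already merged into msg
        return True       # keep the record, just sanitized

logger = logging.getLogger("app")
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
logger.addHandler(handler)
logger.addFilter(RedactingFilter())
logger.setLevel(logging.INFO)

# The password value and the e-mail address are masked before output.
logger.info("login attempt user=%s password=%s", "alice@example.com", "hunter2")
```

A filter like this is a last line of defense, not a substitute for reviewing what gets logged in the first place; it is best attached centrally (on the root logger or every handler) so no code path can bypass it.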

So let’s take care of our customers and end users by offering them an elegant and secure service, where information is not leaked and where we store only what is needed!


Ionut Indre was born and raised in Romania but is currently discovering the surroundings of Stockholm, Sweden. He is a software engineer involved in the entire development process, with a special interest in cyber security.