Recently I switched from my existing mail provider to Gmail, but for some reason the migration option in Gmail would not connect to my old mail account. I found a decent way to transfer the emails anyway, and this article is a guide to how to do it.
When you have worked in a specific area for a long time, many of the tasks become repetitive. One way to stay sharp is to engage your brain with programming puzzles. A site with many puzzles is hackerrank.com. The name is a bit over the top for my taste, but it keeps an extensive list of different programming challenges.
Each challenge can be solved in the programming language of your choice, which makes it a great way both to keep your skills up to date and to practice languages you do not use that often. The challenges rated medium and up are especially good for keeping your skills sharp.
I have tried to tackle one of the most difficult challenges: Spy Game Revisited. If you keep on reading there will be spoilers, so if you want to solve the problem yourself, please stop reading. I will implement the solution using C# on .NET. You can find the code here.
Redundant PHP-fpm service
Now the service stack has a load balancer and redundant Nginx web servers, but PHP-fpm still runs as a single service. Most of the processing when serving a page request happens in the PHP-fpm server. What prevents us from replicating the PHP-fpm service is that session data is stored on the server's local filesystem, so if we simply replicated the PHP-fpm server without sharing the sessions, it would not work.
In this part, we will make the changes necessary to support a redundant PHP-fpm service.
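To see why local session storage breaks replication, here is a toy simulation. It is a Python sketch, not the actual PHP-fpm setup: two "replicas" behind a round-robin balancer, first each keeping its own sessions (as PHP-fpm does with session files on the local filesystem), then both using a shared session store.

```python
# Toy simulation (Python, not the real PHP-fpm stack): two replicas
# behind a round-robin balancer, with local versus shared sessions.

class Server:
    def __init__(self, shared_store=None):
        # With no shared store, each replica keeps its own sessions,
        # mirroring PHP-fpm writing session files to its local disk.
        self.sessions = shared_store if shared_store is not None else {}

    def handle(self, session_id):
        # Each request increments a per-session counter.
        count = self.sessions.get(session_id, 0) + 1
        self.sessions[session_id] = count
        return count

def round_robin(servers, session_id, requests):
    # Send requests alternately to each replica, like a load balancer.
    return [servers[i % len(servers)].handle(session_id) for i in range(requests)]

# Local sessions: each replica counts separately -> [1, 1, 2, 2]
local = round_robin([Server(), Server()], "abc", 4)

# Shared sessions: state survives across replicas -> [1, 2, 3, 4]
store = {}
shared = round_robin([Server(store), Server(store)], "abc", 4)

print(local, shared)
```

With local storage the session "forgets" every other request, which is exactly the breakage a logged-in user would see; a shared store behaves like a single server.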
When we talk about testing, we mostly think of unit tests. Even though the lines are a bit fuzzy, most agree that a unit test needs to run without any external dependencies and that it must run fast. In most cases, a unit is a single method or class that we test to see if it gives the expected output. If the unit has dependencies, they should be exchanged for a test double/mock to make sure that we only test the logic inside the unit and not the workings of the dependencies.
But we can’t test everything using unit tests. To expand our test coverage, one way to go is integration tests. When writing an integration test, we want to test how a unit interacts with other units or with external dependencies.
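As a minimal illustration of swapping a dependency for a test double (a Python sketch with hypothetical names, not code from any project mentioned here):

```python
from unittest.mock import Mock

def convert(amount, currency, rates_service):
    """Hypothetical unit under test: converts an amount using an
    external exchange-rate lookup (the dependency)."""
    rate = rates_service.get_rate(currency)
    return round(amount * rate, 2)

def test_convert_uses_rate_from_service():
    # Replace the real rate service with a mock, so the test runs fast,
    # needs no network, and exercises only the logic inside convert().
    rates = Mock()
    rates.get_rate.return_value = 0.5
    assert convert(10, "USD", rates) == 5.0
    rates.get_rate.assert_called_once_with("USD")

test_convert_uses_rate_from_service()
```

An integration test of the same unit would instead call the real rate service and verify the two pieces work together, accepting that it runs slower and needs the dependency available.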
Finance has always fascinated me. It is rife with mathematics, very hands-on, has a global marketplace, and assets are repriced all the time. Other interesting aspects are big data, complex relations, and the possibility of endless challenges as the market evolves. It is a field perfect for trying out machine learning technology, and who knows, maybe hitting the jackpot if the findings are profitable. But that is not the initial goal.
The goal for me is to set up a platform that allows me to build different trading algorithms and evaluate them.
Initially (in this article), I want to:
- Find a Python library to support building and backtesting algorithms
- Set up an evaluation method to evaluate the performance of a strategy
- Construct a simple trading algorithm to showcase the evaluation
- Run the system on my own laptop on demand
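To give a flavour of the evaluation step, here is a hypothetical sketch (the function names and the toy buy-and-hold positions are illustrative only, not the library or strategy the articles end up using): given a price series and the positions a strategy took, compute its period returns and total return.

```python
# Toy strategy evaluation sketch. Positions are 1 (long) or 0 (flat);
# the position held going into a period earns that period's return.

def simple_returns(prices):
    """Period-over-period returns of a price series."""
    return [(b - a) / a for a, b in zip(prices, prices[1:])]

def strategy_returns(prices, positions):
    """Returns actually captured by the strategy's positions."""
    return [p * r for p, r in zip(positions, simple_returns(prices))]

def total_return(rets):
    """Compound a list of period returns into one total return."""
    total = 1.0
    for r in rets:
        total *= 1 + r
    return total - 1

# Toy example: long for the first two periods, then flat, so the
# final drop from 105 to 101 is avoided.
prices = [100, 102, 105, 101]
positions = [1, 1, 0]
print(total_return(strategy_returns(prices, positions)))  # ~0.05, i.e. +5%
```

A real backtest adds transaction costs, slippage, and risk-adjusted metrics, but the core loop — positions times returns, then compounding — stays this simple.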
Further down the line, I want to:
- Have a system that can generate trading signals in different markets
- Run the system on AWS and update automatically
- Have a web frontend which shows the performance of the algorithm(s) and the signals
- Have the algorithms connected to a real account to do automatic trading – far into the future
Of course, this is not an exhaustive list, and many more aspects will, without a doubt, pop up. So keep reading.
In this part, the original plan was to set up the PHP-fpm server to be redundant and to fix the problem with the database backup not running. It ended up being more of a cleanup of the setup, but I did learn many things about docker in the process.
We will cover the following:
- How to remove a service from the docker swarm
- How to set up a job scheduler in docker to run the backup jobs, for both files and the database
I just watched a webinar from Nasdaq about using sentiment analysis to predict price movements in stocks. You can find the webinar here; it is a very interesting subject. The presenter shows that sentiment is in many cases an early predictor of a price movement. Of course, the webinar is also a sales pitch for the new analytics hub that Nasdaq has built, which currently consists of nine datasets, one of which is the sentiment data. All nine datasets belong to the category of “alternative data”, which is all the rage in the financial sector.
Read on for an overview of the key points from the webinar, a few of my takes on the pitfalls in this area, and how to do similar sentiment analysis on your own.
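To give a flavour of a do-it-yourself starting point, here is a toy lexicon-based sentiment scorer in Python. It is far simpler than the commercial sentiment datasets discussed above, and the word lists are hypothetical, but it shows the basic shape: count positive and negative words and normalise by headline length.

```python
# Toy lexicon-based sentiment scoring for headlines. The word lists
# are illustrative only; real lexicons (and real products) are far
# larger and handle negation, context, and entity matching.

POSITIVE = {"beat", "growth", "upgrade", "strong", "record"}
NEGATIVE = {"miss", "loss", "downgrade", "weak", "lawsuit"}

def sentiment_score(headline):
    """Return (#positive - #negative) / #words, a score in [-1, 1]."""
    words = headline.lower().split()
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

print(sentiment_score("Record growth as earnings beat estimates"))  # positive
print(sentiment_score("Downgrade after weak quarter"))              # negative
```

Scoring a stream of headlines per ticker and aggregating over time gives a crude sentiment time series, which is the kind of signal the webinar claims leads price moves.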
The setup needs to scale better than it currently can. Right now, http1 and http2 are defined as two separate services instead of one service with two replicas. The build process is also a bit slow, primarily because it needs to compile PHP for every build. I did not succeed with this part, but I did learn a lot about the docker build cache; more about that later. I also managed to fix a few other things that bothered me, like the PHP file upload limit, and I updated WordPress, since v4.9.1 was released after my previous deployment.
I think that giving readers the possibility to comment on articles allows for a more dynamic discussion. But I have never really liked the native commenting system in WordPress, so I looked at other systems.
By combining the power of docker and Python, I can create an analytics platform that will always run and does not depend on the versions of Python, or anything else, that I run on my laptop.
I will show how easy it is to set up Jupyter notebooks and use them for analytics, and how easy it is to publish an analysis to this blog. Docker gives me at least two benefits: 1) I can be sure that when I start my docker image, the analysis will always run, because docker ensures the versions of all the software stay the same; 2) if my laptop is not enough to run an analysis, I can deploy the same docker image to a much more powerful computer in the cloud.