Optimizing the quality over time is called "adaptivity" in this context.
IQR explanation
2. If the Renko chart is not empty, get the last market price and add it to the Renko chart. If no new brick is built, pass to the next iteration. Otherwise, if the new brick follows the current direction, cover part of the current position (0 - 100%). If the new brick is built in the opposite direction, close the current position, because this means the trend has changed, and empty the Renko chart.
3. Repeat these steps while price data continues to arrive.

Catalyst is an algorithmic trading library for crypto-assets written in Python. It allows trading strategies to be easily expressed and backtested against historical data (with daily and minute resolution), providing analytics and insights regarding a particular strategy's performance.
Basically, a Catalyst script consists of a few parts: initialize, handle_data, analyze, and the run_algorithm execution call. Let's code the algorithm.
First of all, the required libraries should be imported; the pyrenko module can be found on .
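The imports gist embedded in the original post is not reproduced here; a minimal sketch of what the script relies on might look like the following (pyrenko is the author's own Renko module, and the Catalyst imports follow the library's public API):

```python
# Minimal sketch of the required imports (the exact list from the original
# embedded gist is not recoverable; pyrenko is the author's own module).
import numpy as np
import pandas as pd
import scipy.stats as st   # used for the IQR calculation
import pyrenko

from catalyst import run_algorithm
from catalyst.api import symbol, record, order_target_percent
```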
Some information from the tutorial: every Catalyst algorithm consists of at least two functions you have to define:
initialize(context)
handle_data(context, data)
Before the start of the algorithm, catalyst calls the initialize() function and passes in a context variable. context is a persistent namespace for you to store variables you need to access from one algorithm iteration to the next.
After the algorithm has been initialized, catalyst calls the handle_data() function on each iteration, that is once per day (daily) or once every minute (minute), depending on the frequency we choose to run our simulation. On every iteration, handle_data() is passed the same context variable and an event-frame called data, containing the current trading bar with open, high, low, and close (OHLC) prices as well as volume for each crypto asset in your universe.
Our initialize function looks like this:
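The embedded code is not reproduced in this copy, so what follows is a minimal sketch of what initialize() might contain, based on the parameters described next. The attribute names are illustrative, and the commission/slippage calls are an assumed Catalyst API whose argument names may differ between versions:

```python
def initialize(context):
    context.asset = symbol('eth_btc')      # traded pair
    context.timeframe = '60T'              # basic hourly timeframe
    context.lookback = 15 * 24             # 15 days of hourly bars for the Renko chart
    context.cover_part = 1.0 / 6.0         # cover ~16.6% of the position per confirming brick
    context.position_part = 0.0            # fraction of the position currently open
    context.model = pyrenko.renko()        # empty Renko model (pyrenko API assumed)

    # Bitfinex-like commission and a slippage value; treat these calls as
    # illustrative, since the exact signatures depend on the Catalyst version.
    context.set_commission(maker=0.001, taker=0.002)
    context.set_slippage(slippage=0.0005)
```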
We work with the ETH/BTC crypto pair. The basic timeframe is hourly (60T). The Renko chart uses 15 days (15 * 24 hourly bars) of data. We cover 16.6% of the position amount after each new brick in the current direction. The commission is similar to the commission on the Bitfinex exchange. We also apply a slippage value to better approximate how the strategy would behave in live trading.
The general logic of the algorithm is in the handle_data function:
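The handle_data gist is likewise omitted here; the sketch below implements the logic described in the next paragraph. It assumes the pyrenko interface from the companion article (set_brick_size, build_history, do_next, get_renko_prices, get_renko_directions), uses a simplified IQR-based brick size, and opens long positions only:

```python
def handle_data(context, data):
    current_price = data.current(context.asset, 'price')

    if len(context.model.get_renko_prices()) == 0:
        # Empty model: get the history, estimate the brick size from the IQR of
        # recent price moves (simplified; the original optimizes the size within
        # an IQR-based range), rebuild the Renko chart, and open a position.
        prices = data.history(context.asset, 'price',
                              bar_count=context.lookback,
                              frequency=context.timeframe)
        brick_size = st.iqr(prices.diff().abs().dropna())
        context.model = pyrenko.renko()
        context.model.set_brick_size(brick_size=brick_size, auto=False)
        context.model.build_history(prices=prices.values)
        if context.model.get_renko_directions()[-1] == 1:
            order_target_percent(context.asset, 1.0)   # long only in this sketch
            context.position_part = 1.0
    else:
        # Feed the last price into the existing Renko chart.
        num_new_bricks = context.model.do_next(current_price)
        directions = context.model.get_renko_directions()
        if num_new_bricks > 0:
            if directions[-1] == directions[-2]:
                # New brick in the same direction: cover part of the position.
                context.position_part = max(
                    0.0, context.position_part - context.cover_part * num_new_bricks)
                order_target_percent(context.asset, context.position_part)
            else:
                # Brick in the opposite direction: the trend has changed, so close
                # the position and reset the Renko model.
                order_target_percent(context.asset, 0.0)
                context.position_part = 0.0
                context.model = pyrenko.renko()

    # Pass additional series to the performance DataFrame via record().
    renko_prices = context.model.get_renko_prices()
    record(price=current_price,
           renko_price=renko_prices[-1] if len(renko_prices) > 0 else np.nan,
           num_created_bars=len(renko_prices),
           amount=context.position_part)
```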
Firstly, we check whether the model is empty. If it is, we get the data, calculate the IQR, optimize the brick size, build the Renko chart, and open an order. Otherwise, we get the last price and feed it to the Renko chart, then check how many new bricks were built and the direction of the last brick. Each block of the code contains a comment, which helps you match the code to the algorithm.

Additional information is passed using the record function. This information is used in the analyze function, which runs after the algorithm execution. In that function, we can draw graphs, calculate the performance of the algorithm, and so on. The perf variable contains basic performance information, as well as the values we added via the record function.
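The analyze() gist is also not shown here; a minimal sketch, assuming matplotlib is available and using the fields recorded in the handle_data sketch above together with the standard columns of the Catalyst performance DataFrame, could look like this:

```python
import matplotlib.pyplot as plt

def analyze(context, perf):
    # Print a few headline metrics from the performance DataFrame.
    print('Total return: {:.2%}'.format(perf.algorithm_period_return.iloc[-1]))
    print('Max drawdown: {:.2%}'.format(perf.max_drawdown.iloc[-1]))
    print('Sortino ratio: {:.2f}'.format(perf.sortino.iloc[-1]))

    # Plot the equity curve, the recorded prices, and the position amount.
    fig, axes = plt.subplots(3, 1, sharex=True, figsize=(10, 8))
    perf.portfolio_value.plot(ax=axes[0], title='Portfolio value')
    perf[['price', 'renko_price']].plot(ax=axes[1], title='ETH/BTC price and Renko price')
    perf.amount.plot(ax=axes[2], title='Position amount')
    plt.tight_layout()
    plt.show()

    # Save the performance DataFrame for the pyfolio analysis further below.
    perf.to_csv('perf.csv')
```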
The last part of the script is run_algorithm, which contains the backtesting period, the capital, the cryptocurrency, and the names of the functions described above.
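A sketch of that call is shown below. The namespace, exchange, quote currency, and dates here are illustrative placeholders rather than the original settings, and older Catalyst versions use base_currency instead of quote_currency:

```python
if __name__ == '__main__':
    run_algorithm(
        capital_base=1.0,                  # illustrative starting capital in BTC
        data_frequency='daily',
        initialize=initialize,
        handle_data=handle_data,
        analyze=analyze,
        exchange_name='bitfinex',
        algo_namespace='renko_strategy',   # hypothetical namespace
        quote_currency='btc',              # base_currency in older Catalyst versions
        start=pd.to_datetime('2017-10-01', utc=True),   # illustrative dates
        end=pd.to_datetime('2018-10-01', utc=True)
    )
```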
In this example, we work with daily data feeding (the data_frequency parameter).
Terminal window after launching the script
The basic metrics can be found in the terminal output; these are the metrics we print in the analyze function. The total return of the algorithm is 252.55% with a -18.74% maximum drawdown, which is not bad for almost one year. You can use the Sortino ratio to compare different algorithms; I considered this metric in this . Beta is very close to 0 and Alpha is positive, which means our algorithm is market-neutral and beats the benchmark. If you are not familiar with these metrics, I recommend this article.
Result graphs in Catalyst
The blue line on the first graph is the equity of the algorithm; the red line is the equity of the benchmark (the ETH/BTC asset). The second graph contains the price of ETH/BTC (grey) and the Renko price (yellow). The brick size is shown on the third graph (blue), and the vertical red lines mark the moments when the Renko chart was rebuilt. The number of created Renko bricks is shown on the fourth graph, and the position amount on the fifth. The last graph contains the drawdown.
Let's extract some additional information from our result. The further analysis uses the perf variable saved in CSV format. I use the pyfolio library for this purpose.
It is a Python library for performance and risk analysis of financial portfolios developed by Quantopian Inc. It works well with the Zipline open source backtesting library.

First of all, let's draw the returns of the algorithm and compare its equity with the benchmark:
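The original notebook code is not reproduced here; a sketch of the pyfolio calls that can produce the figures below, assuming the perf.csv file saved in analyze() and pyfolio's standard plotting functions, might look like this:

```python
import pandas as pd
import pyfolio as pf
import matplotlib.pyplot as plt

# Load the saved performance DataFrame and take its daily returns column.
perf = pd.read_csv('perf.csv', index_col=0, parse_dates=True)
returns = perf['returns']

pf.plot_rolling_returns(returns)       # cumulative returns (equity) of the algorithm
plt.show()

pf.show_perf_stats(returns)            # summary statistics table

pf.plot_drawdown_underwater(returns)   # drawdown plot
plt.show()
pf.plot_drawdown_periods(returns, top=5)
plt.show()

pf.plot_monthly_returns_dist(returns)  # distribution of monthly returns
plt.show()
pf.plot_return_quantiles(returns)      # box-plots of daily/weekly/monthly returns
plt.show()

pf.plot_rolling_volatility(returns)    # moving average of volatility
plt.show()
pf.plot_rolling_sharpe(returns)        # moving average of the Sharpe ratio
plt.show()
```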
Equities
Algorithm returns
Summary stats
The summary statistics contain basic metrics, some of which we already got in the output of the analyze function. This variant is more extensive; metrics such as daily value at risk or annual volatility can be very useful when evaluating a strategy.
The next graphs describe the drawdown of our strategy:

Drawdown
Top-5 drawdown periods

Drawdown is one of the key metrics for estimating the reliability of a strategy; reaching a critical drawdown level could also be a trigger to re-optimize the strategy.
Top-5 drawdown periods table

These graphs describe our returns from different angles: the distribution of monthly returns and box-plots of returns for different timeframes (daily, weekly, monthly):
Distribution of monthly returns
Box-plots of returns

Let's look at the volatility of the algorithm as a monthly moving average:
Moving average of volatility

A decreasing (e.g. negative) Sharpe ratio could also be a trigger for the re-optimization process in the strategy's lifetime pipeline:
Moving average of Sharpe ratio