Sixth Week [June 24th - June 30th] [3rd PSF Weekly Check-in]

Youssef15015
Published: 07/01/2019

1. I set up testing and documentation-build pipelines for a different, neighboring repository on my own fork: https://github.com/Youssef15015/carsus. This repository is intended to complement github.com/tardis-sn/tardis, as it stores the atomic data for the elements. The pipeline is much more advanced, as it runs five parallel test jobs: a basic pytest run, one that uses remote data, one that runs slow tests, one that fetches database info, and one that adds coverage results: https://dev.azure.com/joey070/joey070/_build/results?buildId=419&view=codecoverage-tab . I learned three new things while making this pipeline: how to use key-value pairs to assign a different value to the same variable in each parallel job, how to use Azure's conditional statements, and how to publish coverage results.
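To illustrate those three features, here is a minimal sketch of an Azure Pipelines matrix in the style of that setup. The job names, pytest flags, and paths are illustrative assumptions, not the actual carsus configuration:

# hypothetical azure-pipelines.yml excerpt (names and flags are illustrative)
jobs:
- job: Test
  pool:
    vmImage: 'ubuntu-16.04'
  strategy:
    matrix:
      # the same variables take a different value in each parallel job
      basic:
        test_args: ''
        publish_coverage: 'false'
      remote_data:
        test_args: '--remote-data'
        publish_coverage: 'false'
      slow:
        test_args: '--runslow'
        publish_coverage: 'false'
      coverage:
        test_args: '--cov=carsus --cov-report=xml'
        publish_coverage: 'true'
  steps:
  - script: pytest carsus $(test_args)
    displayName: Run pytest
  # Azure condition: only the coverage job publishes its results
  - task: PublishCodeCoverageResults@1
    condition: and(succeeded(), eq(variables['publish_coverage'], 'true'))
    inputs:
      codeCoverageTool: Cobertura
      summaryFileLocation: '$(System.DefaultWorkingDirectory)/coverage.xml'

The matrix entries are the key-value pairs mentioned above: each entry spawns its own parallel job with its own values for test_args and publish_coverage, and the step-level condition decides which job runs the coverage-publishing task.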

2. After this, the goal is to focus on pipelines that run on self-hosted agents. I believe the idea is to enable the Dask framework for parallel execution in TARDIS, so that we can automatically run many iterations of our simulations with different inputs.
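A self-hosted agent only requires pointing the job at a named agent pool instead of a Microsoft-hosted image. The pool name, environment name, and driver script below are placeholders for illustration, not an existing TARDIS setup:

# hypothetical excerpt: running on a self-hosted agent pool
jobs:
- job: DaskRuns
  pool:
    name: 'TARDIS-SelfHosted'   # self-hosted pool instead of a vmImage
  steps:
  - script: |
      source activate tardis
      # hypothetical driver script that uses Dask to launch many
      # TARDIS simulations with different input parameters
      python run_simulation_grid.py
    displayName: Run parallel simulations with Dask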

3. A problem I faced, which I consider the most difficult thus far in the program, is that after unpinning sphinx in the carsus environment to build the documentation for Python 3, the tests were no longer running. I receive this error: RecursionError: maximum recursion depth exceeded while calling a Python object. I could simply increase the maximum recursion depth, but that is frowned upon. The two main dependencies to investigate are sphinx and astropy_helpers. The recursion happens within astropy_helpers, but it is related to sphinx, and I cannot do anything about sphinx itself, so evidently I will try to either find the right astropy_helpers version or change something about the git submodule we use.
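Rather than raising the recursion limit, one direction to test is constraining the documentation dependencies to versions that the bundled astropy_helpers can handle. A hypothetical excerpt of a conda environment file follows; the file contents and version bound are guesses to experiment with, not a confirmed fix:

# hypothetical conda environment excerpt for carsus docs builds
name: carsus
dependencies:
  - python=3.6
  - numpy
  - astropy
  - sphinx<2   # candidate bound to try while checking astropy_helpers compatibility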