<p><a href="https://summerofcode.withgoogle.com/programs/2023/projects/ED0203De"><img alt="gsoc" height="45" src="https://camo.githubusercontent.com/976f3fc0f6caea7a0818d893dd354096e2119daa4d2d761f54cf0eb428617962/68747470733a2f2f646576656c6f706572732e676f6f676c652e636f6d2f6f70656e2d736f757263652f67736f632f7265736f75726365732f646f776e6c6f6164732f47536f432d6c6f676f2d686f72697a6f6e74616c2e737667"></a> <a href="https://summerofcode.withgoogle.com/programs/2023/organizations/python-software-foundation"><img alt="" height="45" src="https://camo.githubusercontent.com/fb9e18f7ca2a84b570706b0c725d31f01587805c6bc02814986fcd31f17a7d64/68747470733a2f2f7777772e707974686f6e2e6f72672f7374617469632f696d672f707974686f6e2d6c6f676f4032782e706e67"></a> <a href="https://fury.gl/latest/community.html"><img alt="fury" height="50" src="https://camo.githubusercontent.com/9075ad7a698be85ef7bd9a195a4a4feb6eb9f5448aaa0e35dce3e765850c124c/68747470733a2f2f707974686f6e2d67736f632e6f72672f6c6f676f732f667572795f6c6f676f2e706e67"></a></p> <h1>Google Summer of Code Final Work Product</h1> <ul> <li><strong>Name:</strong> João Victor Dell Agli Floriano</li> <li><strong>Organisation:</strong> Python Software Foundation</li> <li><strong>Sub-Organisation:</strong> FURY</li> <li style="text-align: justify;"><strong>Project:</strong> <a href="https://github.com/fury-gl/fury/wiki/Google-Summer-of-Code-2023-(GSOC2023)#project-2-fast-3d-kernel-based-density-rendering-using-billboards">FURY - Project 2. Fast 3D kernel-based density rendering using billboards.</a></li> </ul> <h2>Abstract</h2> <p>The goal of this project was to implement 3D Kernel Density Estimation rendering in FURY.
Kernel Density Estimation, or KDE, is a statistical method that uses kernel smoothing to model and estimate the density distribution of a set of points defined inside a given region. For the graphical implementation, post-processing techniques such as offscreen rendering to framebuffers and colormap post-processing were used to achieve the desired results. The project was completed with a functional basic KDE rendering backed by a solid and easy-to-use API, as well as some additional features.</p> <h2>Proposed Objectives</h2> <ul> <li> <p><strong>First Phase</strong> : Implement framebuffer usage in FURY</p> <ul> <li>Investigate the usage of float framebuffers inside FURY's environment.</li> <li>Implement a float framebuffer API.</li> </ul> </li> <li> <p><strong>Second Phase</strong> : Shader-framebuffer integration</p> <ul> <li>Implement a shader that uses a colormap to render framebuffers.</li> <li>Escalate this rendering for composing multiple framebuffers.</li> </ul> </li> <li> <p><strong>Third Phase</strong> : KDE Calculations</p> <ul> <li>Investigate KDE calculation for point-cloud datasets.</li> <li>Implement KDE calculation inside the framebuffer rendering shaders.</li> <li>Test KDE for multiple datasets.</li> </ul> </li> </ul> <h2>Objectives Completed</h2> <ul> <li> <p><strong>Implement framebuffer usage in FURY</strong></p> <p>The first phase, addressed from <em>May/29</em> to <em>July/07</em>, started with the investigation of <a href="https://vtk.org/doc/nightly/html/classvtkOpenGLFramebufferObject.html#details%3E">VTK's Framebuffer Object</a>, a vital part of this project, to understand how to use it properly.</p> <p>Framebuffer Objects, abbreviated as FBOs, are the key to post-processing effects in OpenGL, as they are used to render things offscreen and save the resulting image to a texture that will later be used to apply the desired post-processing effects within the object's <a 
href="https://www.khronos.org/opengl/wiki/Fragment_Shader">fragment shader</a> rendered to screen, in this case, a <a href="http://www.opengl-tutorial.org/intermediate-tutorials/billboards-particles/billboards/">billboard</a>. In the case of the <a href="https://en.wikipedia.org/wiki/Kernel_density_estimation">Kernel Density Estimation</a> post-processing effect, we need a special kind of FBO, one that stores texture values as floats, unlike the standard 8-bit unsigned int storage. This is necessary because the KDE rendering involves rendering every KDE point calculation to separate billboards, rendered to the same scene, which have their intensities, divided by the number of points rendered, blended with <a href="https://www.khronos.org/opengl/wiki/Blending">OpenGL Additive Blending</a>; if a relatively big number of points is rendered at the same time, 32-bit float precision is needed to guarantee that small intensity values will not be capped to zero and disappear.</p> <p>After a month going through VTK's FBO documentation and weeks spent trying different approaches, the method would still not work properly, as some details seemed to be missing from the documentation, and asking the community did not solve the problem either. After I reported that to my mentors, who unsuccessfully tried to make it work themselves, they decided it was better to take another path, as the 32-bit float FBO would result in a much more complicated pipeline: using the <a href="https://vtk.org/doc/nightly/html/classvtkWindowToImageFilter.html">VTK WindowToImageFilter</a> method as a workaround, described in this <a href="https://fury.gl/latest/posts/2023/2023-07-03-week-5-joaodellagli.html">blogpost</a>.
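</p> <p>To make the precision concern above concrete, here is a small NumPy sketch (illustrative only, not FURY code) of why 8-bit storage loses small blended intensities while 32-bit float storage keeps them:</p> <pre><code class="language-python">import numpy as np

# Illustrative sketch (not FURY code): many small blended contributions
# survive in a 32-bit float channel but vanish in an 8-bit one.
n_points = 10_000
contribution = 1.0 / n_points       # each point's intensity after the n division

as_float = np.float32(contribution)              # 32-bit float storage
as_uint8 = np.uint8(round(contribution * 255))   # 8-bit storage: 256 levels in [0, 1]

print(as_float)  # small, but preserved
print(as_uint8)  # 0: the contribution is gone
</code></pre> <p>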
This workaround led to the development of three new functions for FURY, <code>window_to_texture()</code>, <code>texture_to_actor()</code> and <code>colormap_to_texture()</code>, which allow passing different kinds of textures to FURY actors' shaders: the first captures a window and passes it as a texture to an actor, the second passes an external texture to an actor, and the third specifically passes a colormap as a texture to an actor. It is important to say that <code>WindowToImageFilter()</code> is not the ideal way to make this work, as the method does not seem to support float textures. However, a workaround for that is currently being worked on, as I will describe later on.</p> <p><em>Pull Requests:</em></p> <ul> <li> <p><strong>KDE Rendering Experimental Program (needs major revision):</strong> <a href="https://github.com/fury-gl/fury/pull/804">fury-gl/fury#804</a></p> <p>The result of this whole FBO and WindowToImageFilter experimentation is well documented in PR <a href="https://github.com/fury-gl/fury/pull/804">#804</a>, which implements an experimental version of a KDE rendering program. The future of this PR, as discussed with my mentors, is to be better documented so it can serve as an example for developers on how to develop features in FURY with the tools used, and that shall be done soon.</p> </li> </ul> </li> <li> <p><strong>Shader-framebuffer integration</strong></p> <p>The second phase, which initially consisted of "Implement a shader that uses a colormap to render framebuffers" and "Escalate this rendering for composing multiple framebuffers", was actually a pretty simple phase that could be addressed in one week, <em>July/10</em> to <em>July/17</em>, done at the same time as the third phase goal and documented in this <a href="https://fury.gl/latest/posts/2023/2023-07-17-week-7-joaodellagli.html">blogpost</a>.
As FURY already had a tool for generating and using colormaps, they were simply connected to the shader part of the program as textures, with the functions explained above. Below is the result of the <code>matplotlib viridis</code> colormap passed to a simple Gaussian KDE render:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/final_2d_plot.png"><img alt="Final 2D Plot" height="576" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/final_2d_plot.png" width="668"></a></p> <p>That is also included in PR <a href="https://github.com/fury-gl/fury/pull/804">#804</a>. Having the 2D plot ready, some time was taken to figure out how to enable a 3D render, including rotation and other movement around the rendered set, which was solved by learning about the callback properties that exist inside <code>VTK</code>. Callbacks are ways to execute code inside the VTK rendering loop, enclosed inside <code>vtkRenderWindowInteractor.start()</code>. If it is desired to add a piece of code that, for example, passes a time variable to the fragment shader over time, a callback function can be declared:</p> <pre><code>from fury import window

t = 0
showm = window.ShowManager(...)

def callback_function(obj, event):
    global t
    t += 0.01
    pass_shader_uniforms_to_fs(t, "t")

showm.add_iren_callback(callback_function, "RenderEvent")
</code></pre> <p>The piece of code above created a function that updates the time variable <code>t</code> on every <code>"RenderEvent"</code> and passes it to the fragment shader.
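</p> <p>Outside of VTK, the same update pattern can be sketched in plain Python; the dict below is a stand-in for the shader uniforms (<code>pass_shader_uniforms_to_fs</code> above is a placeholder, not a real FURY function):</p> <pre><code class="language-python"># Plain-Python sketch of the callback pattern: a closure keeps the time
# state, and each simulated "RenderEvent" advances it and stores the value
# where a shader uniform setter would receive it.
uniforms = {}

def make_callback(step=0.01):
    t = 0.0
    def callback(caller=None, event=None):  # VTK passes (caller, event)
        nonlocal t
        t += step
        uniforms["t"] = t   # stand-in for passing t to the fragment shader
    return callback

callback = make_callback()
for _ in range(3):          # simulate three consecutive RenderEvents
    callback()
print(uniforms["t"])        # approximately 0.03
</code></pre> <p>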
With that property, the camera and some other parameters could be updated, which enabled 3D visualization that then outputted the following result, using the <code>matplotlib inferno</code> colormap:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/3d_kde_gif.gif"><img alt="3D KDE Plot" height="657" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/3d_kde_gif.gif" width="657"></a></p> </li> <li> <p><strong>KDE Calculations</strong> (ongoing)</p> <p>As said before, the second and third phases were done simultaneously, so after having a way to capture the window and use it as a texture, the colormap ready, and an initial KDE render ready, all that was needed was to improve the KDE calculations. As this <a href="https://en.wikipedia.org/wiki/Kernel_density_estimation">Wikipedia page</a> explains, a KDE calculation estimates an abstract density around a set of points defined inside a given region using a kernel, a function that models the density around a point based on its associated distribution sigma.</p> <p>A well-known kernel is, for example, the <strong>Gaussian Kernel</strong>, which says that the density around a point p with distribution sigma is defined as:</p> <p><img alt="" height="88" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/latex1.png" width="344"></p> <p>Using that kernel, we can calculate the KDE of a set of points A with associated distributions S by calculating their individual Gaussian distributions, summing them up and dividing the sum by the total number of points n:</p> <p><img alt="" height="86" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/latex2.png" width="364"></p> <p>So I dove into implementing all of that in the offscreen rendering part, and that is when the lack of a float framebuffer would charge its cost.
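</p> <p>In NumPy terms, the summation above can be written as a CPU reference (an illustrative sketch, not the shader code; constant normalization factors are omitted):</p> <pre><code class="language-python">import numpy as np

def gaussian_kde(x, points, sigmas):
    # Density at position x: Gaussian contribution of every point,
    # summed and divided by the number of points n.
    d2 = np.sum((x - points) ** 2, axis=-1)        # squared distances to x
    return np.sum(np.exp(-d2 / (2.0 * sigmas ** 2))) / len(points)

rng = np.random.default_rng(0)
points = rng.random((100, 3))       # a random 3D point cloud in the unit cube
sigmas = np.full(100, 0.25)         # one bandwidth (sigma) per point
density = gaussian_kde(np.array([0.5, 0.5, 0.5]), points, sigmas)
print(density)  # a single scalar density value
</code></pre> <p>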
As can be seen above, calculating each point's density isn't the whole job, as I also need to divide each contribution by the total number of points n and then sum them all. The problem is that, if the number of points is big enough, the individual densities will be really low. That would not be a problem for a 32-bit float framebuffer, but it is <em>definitely</em> a problem for an 8-bit integer framebuffer, as small enough values will simply underflow and disappear. That issue is currently under investigation, and some solutions have already been presented, as I will show in the <strong>Objectives in Progress</strong> section.</p> <p>Apart from that, after having the experimental program ready, I focused on modularizing it into a functional and simple API (without the n division for now), and I could get a good set of results from that. The API I first developed implemented the <code>EffectManager</code> class, responsible for managing all of the behind-the-scenes steps necessary for the KDE render to work, encapsulated inside the <code>EffectManager.kde()</code> method. It had the following look:</p> <pre><code>from fury.effect_manager import EffectManager
from fury import window

showm = window.ShowManager(...)

# KDE rendering setup
em = EffectManager(showm)
kde_actor = em.kde(...)
# End of KDE rendering setup

showm.scene.add(kde_actor)

showm.start()
</code></pre> <p>Those straightforward instructions, which hid several lines of code and setup, managed to output the following result:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/fianl_3d_plot.png"><img alt="API 3D KDE plot" height="666" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/fianl_3d_plot.png" width="652"></a></p> <p>And this was not the only feature I implemented for this API, as the use of the <code>WindowToImageFilter</code> method opened doors to a whole new world for FURY: the world of post-processing effects.
With these features set up, I managed to implement a <em>Gaussian blur</em> effect, a <em>grayscale</em> effect and a <em>Laplacian</em> effect for calculating "borders":</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/gaussian_blur.png"><img alt="Gaussian Blur effect" height="382" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/gaussian_blur.png" width="644"></a></p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/grayscale.png"><img alt="Grayscale effect" height="371" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/grayscale.png" width="642"></a></p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/laplacian1.gif"><img alt="Laplacian effect" height="641" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/laplacian1.gif" width="641"></a></p> <p>As this wasn't the initial goal of the project and I still had several issues to deal with, I decided to leave these features as a future addition.</p> <p>Talking with my mentors, we realized that the first KDE API, even though simple, could lead to bad usage, as the <code>em.kde()</code> method, which outputted a <em>FURY actor</em>, had dependencies different from any other object of its kind, making it a new class of actors, which could lead to confusion and bad handling. After some pair programming sessions, they instructed me to take a similar but different road from the one I was on, turning the KDE actor into a new class, the <code>KDE</code> class. This class would have almost the same set of instructions present in the prior method, but it would break them up so that the effect would only be completely set up after being passed to the <code>EffectManager</code> via its add function. Below, how the refactoring handles it:</p> <pre><code>from fury.effects import EffectManager, KDE
from fury import window

showm = window.ShowManager(...)
# KDE rendering setup
em = EffectManager(showm)
kde_effect = KDE(...)
em.add(kde_effect)
# End of KDE rendering setup

showm.start()
</code></pre> <p>This outputted the same results as shown above. It may have cost some simplicity, as we are now one line farther from having it working, but it is more explicit in telling the user this is not just a normal actor.</p> <p>Another detail I worked on was kernel variety. The Gaussian kernel isn't the only one available to model density distributions; there are several others that can do that job, as can be seen in this <a href="https://scikit-learn.org/stable/modules/density.html">scikit-learn piece of documentation</a> and this <a href="https://en.wikipedia.org/wiki/Kernel_(statistics)">Wikipedia page on kernels</a>. Based on the scikit-learn KDE implementation, I worked on implementing the following kernels inside our API, which can be chosen as a parameter when calling the <code>KDE</code> class:</p> <ul> <li>Cosine</li> <li>Epanechnikov</li> <li>Exponential</li> <li>Gaussian</li> <li>Linear</li> <li>Tophat</li> </ul> <p>Below, the comparison between them using the same set of points and bandwidths:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/kernels.png"><img alt="Comparison between the six implemented kernels" height="413" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/kernels.png" width="617"></a></p> <p><em>Pull Requests</em>:</p> <ul> <li> <p><strong>First Stage of the KDE Rendering API (will be merged soon)</strong> <a href="https://github.com/fury-gl/fury/pull/826">fury-gl/fury#826</a></p> <p>All of this work culminated in PR <a href="https://github.com/fury-gl/fury/pull/826/">#826</a>, which proposes to add the first stage of this API (there are some details yet to be completed, like the n division) to FURY.
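</p> <p>For reference, the kernel profiles listed above can be sketched on the CPU as plain functions of distance and bandwidth, following the shapes in the scikit-learn density documentation (normalization constants omitted). This is an illustrative NumPy sketch, not the GLSL used in the API:</p> <pre><code class="language-python">import numpy as np

# Unnormalized kernel profiles for distance x and bandwidth h; the
# compact-support kernels evaluate to 0 outside x in [0, h).
kernels = {
    "cosine":       lambda x, h: np.cos(np.pi * x / (2 * h)) * np.heaviside(h - x, 1.0),
    "epanechnikov": lambda x, h: np.maximum(1.0 - (x / h) ** 2, 0.0),
    "exponential":  lambda x, h: np.exp(-x / h),
    "gaussian":     lambda x, h: np.exp(-x ** 2 / (2 * h ** 2)),
    "linear":       lambda x, h: np.maximum(1.0 - x / h, 0.0),
    "tophat":       lambda x, h: np.heaviside(h - x, 1.0),
}

for name, kernel in kernels.items():
    print(name, kernel(0.5, 1.0))   # all peak at x = 0 and decay with distance
</code></pre> <p>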
This PR added the described API and also proposed some minor changes to already existing FURY functions related to callbacks, changes necessary for this and other future applications that use them. It also added the six kernels described and a simple documented example on how to use this feature.</p> </li> </ul> </li> </ul> <h2>Other Objectives</h2> <ul> <li><strong>Stretch Goals</strong> : SDE Implementation, Network/Graph visualization using SDE/KDE, Tutorials <ul> <li>Investigate SDE calculation for surface datasets.</li> <li>Implement SDE calculation inside the framebuffer rendering shaders.</li> <li>Test SDE for multiple datasets.</li> <li>Develop comprehensive tutorials that explain SDE concepts and FURY API usage.</li> <li>Create practical, scenario-based tutorials using real datasets and/or simulations.</li> </ul> </li> </ul> <h2>Objectives in Progress</h2> <ul> <li> <p><strong>KDE Calculations</strong> (ongoing)</p> <p>The KDE rendering, even though almost complete, is missing the n division, an important step, as this normalization allows colormaps to cover the whole range of values rendered. The lack of a float FBO made a big difference in the project: the search for a functional implementation of it not only delayed the project, but is also vital for the correct calculations to work.</p> <p>For that last part, the workaround thought of was to try an approach I later figured out is an old one, as can be checked in <a href="https://developer.nvidia.com/gpugems/gpugems/part-ii-lighting-and-shadows/chapter-12-omnidirectional-shadow-mapping">GPU Gems section 12.3.3</a>: if I need 32-bit float precision and I have four 8-bit integer channels available, why not try to pack this float into this RGBA texture?
I first tried to do it myself, but it didn't work for some reason, so I tried <a href="https://aras-p.info/blog/2009/07/30/encoding-floats-to-rgba-the-final/">Aras Pranckevičius'</a> implementation, which does the following:</p> <pre><code>vec4 float_to_rgba(float value) {
    vec4 bitEnc = vec4(1.,256.,65536.0,16777216.0);
    vec4 enc = bitEnc * value;
    enc = fract(enc);
    enc -= enc.yzww * vec2(1./255., 0.).xxxy;
    return enc;
}
</code></pre> <p>That initially worked, but for some reason I am still trying to understand, it is resulting in a really noisy texture:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/noisy%20kde.png"><img alt="Noisy KDE render" height="563" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/noisy%20kde.png" width="641"></a></p> <p>One way to try to mitigate that in the meantime is to pass the result through a Gaussian blur filter, to try to smooth it out:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/blurred_kde.png"><img alt="Blurred result" height="450" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/blurred_kde.png" width="644"></a></p> <p>But that is not an ideal solution either, as it may lead to distortions in the actual density values, depending on the application of the KDE. For now, my goal is to first find the root of the noise problem and then, if that does not work, try to make the Gaussian filter work.</p> <p>Another detail that would be a good addition to the API is UI controls. Filipi, one of my mentors, told me it would be a good feature if the user could control the intensities of the bandwidths for a better structural visualization of the render, and since FURY already has a good set of <a href="https://fury.gl/latest/auto_examples/index.html#user-interface-elements">UI elements</a>, I just needed to integrate them into my program via callbacks. I tried implementing an intensity slider.
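</p> <p>Coming back to the float-packing discussion above: the encoding can be mirrored on the CPU to check how much precision survives the 8-bit round trip. The sketch below is a NumPy translation of the packing idea, using the 255-based constants from Aras Pranckevičius' post together with the matching decode step; it is illustrative, not code from the project:</p> <pre><code class="language-python">import numpy as np

# NumPy mirror of RGBA float packing: spread a [0, 1) float over four
# 8-bit channels and reconstruct it afterwards.
BIT_ENC = np.array([1.0, 255.0, 255.0 ** 2, 255.0 ** 3])
BIT_DEC = 1.0 / BIT_ENC

def float_to_rgba(value):
    enc, _ = np.modf(BIT_ENC * value)        # fract(bitEnc * value)
    enc -= np.append(enc[1:], 0.0) / 255.0   # carry correction per channel
    return enc

def rgba_to_float(rgba):
    return float(rgba @ BIT_DEC)             # dot(rgba, 1.0 / bitEnc)

value = 0.123456
stored = np.round(float_to_rgba(value) * 255.0) / 255.0   # simulate 8-bit storage
error = abs(rgba_to_float(stored) - value)
print(error)  # far below the plain 8-bit resolution of 1/255
</code></pre> <p>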
However, the slider is making the program crash randomly, for reasons I still don't know, so that is another issue under investigation. Below, a first version of that feature, which was working before the crashes:</p> <p><a href="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/slider.gif"><img alt="Slider for bandwidths" height="461" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/slider.gif" width="461"></a></p> <p><em>Pull Requests:</em></p> <ul> <li><strong>UI intensity slider for the KDE rendering API (draft)</strong>: <a href="https://github.com/fury-gl/fury/pull/849">fury-gl/fury#849</a></li> <li><strong>Post-processing effects for FURY Effects API (draft)</strong>: <a href="https://github.com/fury-gl/fury/pull/850">fury-gl/fury#850</a></li> </ul> </li> </ul> <h2>GSoC Weekly Blogs</h2> <ul> <li>My blog posts can be found on the <a href="https://fury.gl/latest/blog/author/joao-victor-dell-agli-floriano.html">FURY website</a> and the <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/">Python GSoC blog</a>.</li> </ul> <h2>Timeline</h2> <table> <tbody> <tr> <th>Date</th> <th>Description</th> <th>Blog Link</th> </tr> </tbody> <tbody> <tr> <td>Week 0 (29-05-2023)</td> <td>The Beginning of Everything</td> <td><a href="https://fury.gl/latest/posts/2023/2023-05-29-week-0-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-beggining-of-everything-week-0/">Python</a></td> </tr> <tr> <td>Week 1 (05-06-2023)</td> <td>The FBO Saga</td> <td><a href="https://fury.gl/latest/posts/2023/2023-06-05-week-1-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-fbo-saga-week-1/">Python</a></td> </tr> <tr> <td>Week 2 (12-06-2023)</td> <td>The Importance of (good) Documentation</td> <td><a href="https://fury.gl/latest/posts/2023/2023-06-12-week-2-joaodellagli.html">FURY</a> - <a 
href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-importance-of-good-documentation-week-2/">Python</a></td> </tr> <tr> <td>Week 3 (19-06-2023)</td> <td>Watch Your Expectations</td> <td><a href="https://fury.gl/latest/posts/2023/2023-06-19-week-3-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-3-watch-your-expectations/">Python</a></td> </tr> <tr> <td>Week 4 (26-06-2023)</td> <td>Nothing is Ever Lost</td> <td><a href="https://fury.gl/latest/posts/2023/2023-06-26-week-4-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-4-nothing-is-ever-lost/">Python</a></td> </tr> <tr> <td>Week 5 (03-07-2023)</td> <td>All Roads Lead to Rome</td> <td><a href="https://fury.gl/latest/posts/2023/2023-07-03-week-5-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-5-all-roads-lead-to-rome/">Python</a></td> </tr> <tr> <td>Week 6 (10-07-2023)</td> <td>Things are Starting to Build Up</td> <td><a href="https://fury.gl/latest/posts/2023/2023-07-10-week-6-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-6-things-are-starting-to-build-up/">Python</a></td> </tr> <tr> <td>Week 7 (17-07-2023)</td> <td>Experimentation Done</td> <td><a href="https://fury.gl/latest/posts/2023/2023-07-17-week-7-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-7-experimentation-done/">Python</a></td> </tr> <tr> <td>Week 8 (24-07-2023)</td> <td>The Birth of a Versatile API</td> <td><a href="https://fury.gl/latest/posts/2023/2023-07-24-week-8-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-8-the-birth-of-a-versatile-api/">Python</a></td> </tr> <tr> <td>Week 9 (31-07-2023)</td> <td>It is Polishing Time!</td> <td><a href="https://fury.gl/latest/posts/2023/2023-07-31-week-9-joaodellagli.html">FURY</a> - <a 
href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-9-it-is-polishing-time/">Python</a></td> </tr> <tr> <td>Week 10 (07-08-2023)</td> <td>Ready for Review!</td> <td><a href="https://fury.gl/latest/posts/2023/2023-08-07-week-10-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/ready-for-review/">Python</a></td> </tr> <tr> <td>Week 11 (14-08-2023)</td> <td>A Refactor is Sometimes Needed</td> <td><a href="https://fury.gl/latest/posts/2023/2023-08-14-week-11-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/a-refactor-is-sometimes-needed/">Python</a></td> </tr> <tr> <td>Week 12 (21-08-2023)</td> <td>Now That is (almost) a Wrap!</td> <td><a href="https://fury.gl/latest/posts/2023/2023-08-21-week-12-joaodellagli.html">FURY</a> - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-12-now-that-is-almost-a-wrap/">Python</a></td> </tr> </tbody> </table> <p><em>joaovdafloriano@usp.br (joaodellagli) - Thu, 31 Aug 2023 14:40:56 +0000 - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/google-summer-of-code-final-work-product-5/">permalink</a></em></p> <h1><a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-12-now-that-is-almost-a-wrap/">Week 12: Now That is (almost) a Wrap!</a></h1> <p>Hello everyone, it's time for another GSoC blogpost! Today, I am going to talk about some minor details I worked on in my project.</p> <h2>Last Week's Effort</h2> <p>After the API refactoring done last week, I focused on addressing the reviews I got from it. The first issues I addressed were related to style, as there were some minor details my GSoC contributors pointed out that needed change. Also, I addressed an issue I was having with the type hint of one of my functions.
Filipi, my mentor, showed me there is a way to have more than one type hint in the same parameter: all I needed to do was use the <em>Union</em> class from the <em>typing</em> module, as shown below:</p> <pre><code class="language-python">from typing import Union as tUnion
import numpy as np

def function(variable: tUnion[float, np.ndarray]):
    pass</code></pre> <p>Using that, I could set the type hint of the <em>bandwidth</em> variable to <em>float</em> and <em>np.ndarray</em>.</p> <h2>So how did it go?</h2> <p>All went fine with no difficulty at all, thankfully.</p> <h2>The Next Steps</h2> <p>My next plans are, after having PR <a href="https://github.com/fury-gl/fury/pull/826">#826</a> merged, to work on the float encoding issue described in <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-9-it-is-polishing-time/">this blogpost</a>. Also, I plan to tackle the UI idea once again, to see if I can finally give the user a way to control the intensities of the distributions.</p> <p>Wish me luck!</p> <p><em>joaovdafloriano@usp.br (joaodellagli) - Wed, 23 Aug 2023 23:36:19 +0000 - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-12-now-that-is-almost-a-wrap/">permalink</a></em></p> <h1><a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/a-refactor-is-sometimes-needed/">A Refactor is Sometimes Needed</a></h1> <p>Hello everyone, it's time for another weekly blogpost! Today I am going to share some updates on the API refactoring I was working on with my mentors.</p> <h2>Last Week's Effort</h2> <p>As I shared with you <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/ready-for-review/">last week</a>, the first draft of my API was finally ready for review, as I had finished tweaking the remaining missing details. I was tasked with finding a good example of usage of the tools we proposed, and I started to do that; however, after testing it with some examples, I figured out some significant bugs were to be fixed.
Also, after some reviews and hints from some of my mentors and other GSoC contributors, we realised that some refactoring should be done, mainly focused on avoiding bad API usage by the user.</p> <h2>So how did it go?</h2> <p>Initially, I thought only one bug was the source of the issues the rendering presented, but it turned out to be two, which I will explain further.</p> <p>The first bug was related to scaling and misalignment of the KDE render. The post-processed render of the points not only had sizes different from the original set, but was also misaligned, appearing in positions different from the points' original ones. After some time, I figured out the bug was related to the texture coordinates I was using. Before, this is how my fragment shader looked:</p> <pre><code class="language-cpp">vec2 res_factor = vec2(res.y/res.x, 1.0);
vec2 tex_coords = res_factor*normalizedVertexMCVSOutput.xy*0.5 + 0.5;
float intensity = texture(screenTexture, tex_coords).r;</code></pre> <p>It turns out using these texture coordinates for <em>this case</em> was not the best choice, as even though they match the fragment positions, the idea here was to render the offscreen window, which has the same size as the onscreen one, to the billboard actor. With that in mind, I realised the best choice was using texture coordinates that matched the whole screen's positions, coordinates derived from <em>gl_FragCoord.xy</em> divided by the resolution of the screen, for normalization. Below, the change made:</p> <pre><code class="language-cpp">vec2 tex_coords = gl_FragCoord.xy/res;
float intensity = texture(screenTexture, tex_coords).r;</code></pre> <p>This change worked initially, although with some problems that later revealed the resolution of the offscreen window needed to be updated inside the callback function as well.
Fixing that as well, everything was perfectly aligned and scaled!</p> <p>The second bug was related to the handling of the bandwidth, the former sigma parameter. I realised I wasn't properly dealing with the option of the user passing only a single bandwidth value, so when trying that, only the first point was being rendered. I fixed that too and it worked, so cheers!</p> <p>As I previously said, the bugs were not the only details I spent my time on last week. Under review, the API design, even though simple, showed itself vulnerable to bad usage from the user side, requiring some changes. The change suggested by my mentors was, basically, to take the <em>kde</em> method out of the <em>EffectManager</em> class and create a new class from it inside an <em>effects</em> module, as if it were a special effects class. With this change, the KDE setup would go from:</p> <pre><code class="language-python">em = EffectManager(show_manager)
kde_actor = em.kde(...)
show_manager.scene.add(kde_actor)</code></pre> <p>To:</p> <pre><code class="language-python">em = EffectManager(show_manager)
kde_effect = KDE(...)
em.add(kde_effect)</code></pre> <p>Not a gain in line shortening, but a gain in safety, as it prevents users from misusing the kde_actor. Something worth noting is that I learned how to use the <em>functools.partial</em> function, which allowed me to call the callback function with only some of its parameters pre-filled.</p> <h2>This Week's Goals</h2> <p>With that refactoring made, I am now awaiting a second review so we can finally wrap it up and merge the first stage of this API.
With that done, I will write the final report and wrap this all up.</p> <p>Let's get to work!</p> <p><em>joaovdafloriano@usp.br (joaodellagli) - Sat, 19 Aug 2023 00:50:22 +0000 - <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/a-refactor-is-sometimes-needed/">permalink</a></em></p> <h1><a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/ready-for-review/">Ready for Review!</a></h1> <p>Hello everyone, it's time for another weekly blogpost!</p> <h2>Last Week's Effort</h2> <p>After talking with my mentors, I was tasked with getting my API PR <a href="https://github.com/fury-gl/fury/pull/826">#826</a> ready for review, as it still needed some polishing and, most important of all, it needed its tests working, as this was something I hadn't invested time in since its creation. With that in mind, I spent the whole week cleaning whatever needed it, writing the tests, and also writing a simple example of its usage. I also tried implementing a little piece of UI so the user could control the intensity of the bandwidth of the KDE render, but I had a little problem I will talk about below.</p> <h2>So how did it go?</h2> <p>Fortunately, for the cleaning part, I didn't have any trouble, and my PR is finally ready for review! The most complicated part was writing the tests, as this is something that requires attention to understand exactly what needs to be tested. As for the UI part, I managed to have a slider working for the intensity; however, it was crashing the whole program for some reason, so I decided to leave this idea behind for now. Below, an example of how this should work:</p> <p><img alt="" height="744" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/slider.gif" width="744"></p> <h2>This Week's Goals</h2> <p>After a meeting with my mentors, we decided that this week's focus should be on finding a good usage example of the KDE rendering feature, to have it as a showcase of the capabilities of this API.
Also, they hinted at some changes that need to be made regarding the API, so I will also invest some time in refactoring it.</p> <p>Wish me luck!</p> <p> </p>joaovdafloriano@usp.br (joaodellagli)Thu, 10 Aug 2023 14:08:14 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/ready-for-review/Week 9: It is Polishing Time!https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-9-it-is-polishing-time/<p>Hello everyone, it's time for another weekly blogpost! Today, I am going to update you on my project's latest changes.</p> <h2>Last Week's Effort</h2> <p>After having finished a first draft of the API that will be used for the KDE rendering, and having shown how it could be used for other post-processing effects, my goal was to clean the code and try some details that would add to it so it could be more complete. Having that in mind, I invested in three work fronts:</p> <ol> <li>Fixing some bugs related to rendering more than one post-processing effect actor.</li> <li>Experimenting with other rendering kernels (I was using only the <em>gaussian</em> one).</li> <li>Completing the KDE render by renormalizing the values in relation to the number of points (one of the core KDE details).</li> </ol> <p>All three turned out more complicated than they initially seemed, as I will show below.</p> <h2>So how did it go?</h2> <p>The first one I tackled on Monday and Tuesday, and I had to deal with some issues regarding scaling and repositioning. Due to implementation choices, the final post-processed effects were rendered either bigger than they actually were, or out of their original place. After some time dedicated to finding the root of those problems, I could fix the scaling issue; however, I realised I would probably need to rethink the way the API was implemented. 
As this general post-processing effects API is a side project that comes as a consequence of my main one, I decided to leave that investment for another time, as I first need to guarantee the quality of the latter.</p> <p>The second was an easy and rather interesting part of my week, as I just needed to set up new kernel shaders. Based on the <a href="https://scikit-learn.org/stable/modules/density.html">scikit-learn KDE documentation</a>, I could successfully implement the following kernels:</p> <ul> <li>Gaussian</li> <li>Tophat</li> <li>Epanechnikov</li> <li>Exponential</li> <li>Linear</li> <li>Cosine</li> </ul> <p>These outputted the following (beautiful) results for a set of 1000 random points with random sigmas:</p> <p><img alt="" height="375" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/kernels.png" width="563"></p> <p>The third one is still proving a trickier challenge. If you recall from my first blogposts, I spent something around <em>one month</em> trying to set up float framebuffer objects in FURY with VTK so I could use them in my project. After spending all of that time with no results, Bruno, my mentor, and I <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-5-all-roads-lead-to-rome/">found a way</a> to do what we wanted, but using a different VTK class, <a href="https://vtk.org/doc/nightly/html/classvtkWindowToImageFilter.html">vtkWindowToImageFilter</a>. Well, it was a good workaround back then and it led me all the way here; however, now it is exacting a price. The float framebuffers were an important part of the project because they would allow us to pass <em>32-bit float information</em> from one shader to another, which would be important as it would allow the densities to have higher precision and more fidelity to the calculations. 
When rendering a KDE of a given set of points, we use the function below:</p> <p>KDE(x, y) = n<sup>-1</sup> &sum;<sub>i=1</sub><sup>n</sup> kernel(x &minus; x<sub>i</sub>, y &minus; y<sub>i</sub>)</p> <p>If the number of points <strong>n</strong> is big enough, some KDE results will be really low. This presents a real problem to our implementation because, without the float framebuffers, it is currently only possible to pass <em>8-bit unsigned char</em> information, which allows only 256 values. This is far from ideal, as points whose individual densities are low enough would simply disappear. This presented a problem because, to renormalize the densities, I was retrieving the texture to the CPU, calculating its minimum and maximum values, and passing them to the fragment shader as uniforms for the renormalization, which didn't work if the calculated maximum values were zero.</p> <p>One solution I thought of was a really heavy workaround: if a float is 32-bit and I have exactly four 8-bit unsigned chars, why not try to pack this float into those 4 chars? Well, this is an interesting approach which, I figured out, is already an old one, being reported in <a href="https://developer.nvidia.com/gpugems/gpugems/part-ii-lighting-and-shadows/chapter-12-omnidirectional-shadow-mapping">GPU Gems's chapter 12</a>. Unfortunately, I haven't tried that implementation yet, and went for one I came up with myself, which hasn't exactly worked. I also tried the implementation from <a href="https://aras-p.info/blog/2009/07/30/encoding-floats-to-rgba-the-final">Aras Pranckevičius' website</a>, which seems to be working, even though not perfectly:</p> <p><img alt="" height="462" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/blurred_kde.png" width="662"></p> <p>That looks way better, even though not ideal yet.</p> <h2>This Week's Goals</h2> <p>Talking with my mentors, we decided it was better if I focused on the version without the renormalization for now, as it was already done and running fine. 
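</p> <p>For reference, the float-to-RGBA packing idea discussed above can be sketched on the CPU with NumPy. This is an illustrative, hedged sketch in the spirit of the encoding from Aras Pranckevičius' post, not the exact shader code used in the project; it packs a float in [0, 1) into four 8-bit channels and recovers it:</p> <pre><code class="language-python">import numpy as np

SHIFTS = np.array([1.0, 255.0, 255.0**2, 255.0**3])

def encode_float_rgba(v):
    """Pack a float in [0, 1) into four 8-bit channels."""
    enc = np.modf(v * SHIFTS)[0]            # keep only the fractional parts
    enc -= np.append(enc[1:], 0.0) / 255.0  # remove bits carried into the next channel
    return np.round(np.clip(enc, 0.0, 1.0) * 255.0).astype(np.uint8)

def decode_float_rgba(rgba):
    """Recover the packed float from the four channels."""
    return float(np.dot(rgba / 255.0, 1.0 / SHIFTS))</code></pre> <p>A round trip such as <em>decode_float_rgba(encode_float_rgba(0.123456))</em> recovers the value with far better precision than the 256 levels of a single unsigned char.</p> <p>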
So for this week, I plan to clean up my PR to finally have it ready for a first review, and maybe add a little UI tool to control the intensity of the densities. That should take me some time and discussion, but I hope for it to be ready by the end of the week.</p> <p>Let's get to work!</p>joaovdafloriano@usp.br (joaodellagli)Wed, 02 Aug 2023 21:00:50 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-9-it-is-polishing-time/Week 8: The Birth of a Versatile APIhttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-8-the-birth-of-a-versatile-api/<p>Hello everyone, it's time for another weekly blogpost! Today, I am going to tell you all about how the KDE API development is going, and to show you the potential this holds for the future!</p> <h2>Last Week's Effort</h2> <p>Last week I told you how I managed to render some KDE renders to the screen, both in 2D and 3D, as you may check in my last blogpost. My new task was, as I had this example working, to start the API development. In a meeting with Bruno, one of my mentors, we debated how this could work, reaching two options:</p> <ol> <li>Implement the KDE in a single, simple actor.</li> <li>Implement a KDE rendering manager, as a class.</li> </ol> <p>The first one would have the advantage of being simple and pretty straightforward, as a user would only need to call the actor and have it working in their hands, with the tradeoff of leaving some important steps for a clean API hidden and static. These steps I mention are related to how this rendering works; as I have previously <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-7-experimentation-done/">shown you</a>, it relies on post-processing effects, which need offscreen rendering that is handled, for example, by the <em>callback functions</em>.</p> <p>In short, these functions are instructions the user gives to the interactor to run inside the interaction loop. 
Inside FURY, there are three types of callbacks passed to the window interactor:</p> <ol> <li><strong>Timer Callbacks</strong>: Added to the window interactor, they are a set of instructions that will be called from time to time, with an interval defined by the user.</li> <li><strong>Window Callbacks</strong>: Added directly to the window, they are a set of instructions called whenever a specific event is triggered.</li> <li><strong>Interactor Callbacks</strong>: Added to the window interactor, they are a set of instructions called whenever a specific interaction, for example a mouse left-click, is triggered.</li> </ol> <p>In this API, I will be using the <em>Interactor Callback</em>, set by the <em>window.add_iren_callback()</em> function, which will be called whenever a <em>Render</em> interaction is detected, and which needs to be passed first to the onscreen manager.</p> <p>These details are more complicated and would require, for example, the user to pass the onscreen manager to the <em>actor.kde()</em> function. Also, if a kde actor stopped being used and was deleted, the callback passed for it would still exist inside the manager and be called even when the kde actor is no longer on screen, which is not ideal.</p> <p>Knowing these problems, we thought of the second option, which would have the advantage of not leaving those details and steps behind. It has the tradeoff of maybe complicating things, as it would need to be called after calling the effects manager, but as I will show you below, it is not that complicated <em>at all</em>.</p> <p>I also reviewed my fellow GSoC contributors' PRs, <a href="https://github.com/fury-gl/fury/pull/810">#810</a> and <a href="https://github.com/fury-gl/fury/pull/803">#803</a>. 
Bruno also told me to take a look at <a href="https://www.conventionalcommits.org">Conventional Commits</a>, a way to standardize commits with prefixes, so I did that as well.</p> <h2>So how did it go?</h2> <p>Well, the implemented manager class is named <em>EffectManager()</em>, and to initialize it you only need to pass the onscreen manager:</p> <pre><code class="language-python">effects = EffectManager(onscreen_manager)</code></pre> <p>After that, to render a KDE calculation of points to the screen, you need only to call its <em>kde()</em> function:</p> <pre><code class="language-python">kde_actor = effects.kde(center, points, sigmas, scale=10.0, colormap="inferno")
# The last two parameters are optional</code></pre> <p>Pass it to the onscreen manager scene:</p> <pre><code class="language-python">onscreen_manager.scene.add(kde_actor)</code></pre> <p>And to start it, as usual:</p> <pre><code class="language-python">onscreen_manager.start()</code></pre> <p>As simple as that. These few lines of code output the same result I showed you last week, this time with different sigmas for each point:</p> <p><img alt="" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/3d_kde_gif.gif"></p> <p>After having that working, I experimented beyond. See, as I previously said, we are dealing here with <em>post-processing effects</em>, with KDE being only one of the many existing ones, as this <a href="https://en.wikipedia.org/wiki/Video_post-processing">Wikipedia page</a> on post-processing shows. Knowing that, I tried one of the first filters I learned, the Laplacian one. This filter, as its name hints, applies the <a href="https://en.wikipedia.org/wiki/Discrete_Laplace_operator">Discrete Laplace Operator</a> to an image. This filter highlights sudden changes in value, a good way to detect edges. The process is the same as with the kde actor, requiring only the actor you want to apply the filter to. 
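</p> <p>To make the Laplacian filter concrete, here is a small CPU-side sketch with NumPy. This is an illustration of the discrete Laplace operator itself, not the GLSL fragment shader the API uses; the helper <em>convolve2d</em> is a naive, hypothetical implementation written just for this example:</p> <pre><code class="language-python">import numpy as np

# 3x3 discrete Laplace operator: responds strongly to sudden changes in value (edges)
LAPLACIAN = np.array([[0.0,  1.0, 0.0],
                      [1.0, -4.0, 1.0],
                      [0.0,  1.0, 0.0]])

def convolve2d(image, kernel):
    """Naive 'valid' 2D sliding-window filter, enough to demonstrate edge detection."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

flat = np.ones((5, 5))
step = np.zeros((5, 5))
step[:, 3:] = 1.0  # a vertical edge between columns 2 and 3</code></pre> <p>Applying the kernel to <em>flat</em> yields all zeros (no value changes), while applying it to <em>step</em> yields nonzero responses only around the edge, which is exactly the border-detection behavior described above.</p> <p>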
Below, the result I got from applying that to a box actor:</p> <p><img alt="" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/laplacian1.gif"></p> <p>Something I found important to leave as an option was filter compositing. What if a user wanted to, for example, apply one laplacian filter after another? Well, the example below shows that is possible as well:</p> <p><img alt="" height="864" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/laplacian2.gif" width="864"></p> <p>It still needs some tweaks and suffers from some bugs, but it works! This represents important progress, as it shows the versatility this API may offer. I have also already implemented <em>grayscale</em> and <em>3x3 gaussian blur</em> filters:</p> <p><img alt="" height="439" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/gaussian_blur.png" width="748"><img alt="" height="427" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/grayscale.png" width="748"></p> <h2>This Week's Goals</h2> <p>My plans for this week are to keep working on and polishing the API, mainly the KDE part, so it can be ready for a first review. When that is ready, I plan to experiment with more filters and make this more dynamic, maybe implementing a way to apply custom kernel transformations, passed by the user, to the rendering process. This has been a really exciting journey and I am getting happy with the results!</p> <p>Wish me luck!</p>joaovdafloriano@usp.br (joaodellagli)Tue, 25 Jul 2023 15:12:49 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-8-the-birth-of-a-versatile-api/Week 7: Experimentation Donehttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-7-experimentation-done/<p>Hello everyone, welcome to another weekly blogpost! 
Let's talk about the current status of my project (spoiler: it is beautiful).</p> <h2>Last Week's Effort</h2> <p>Having accomplished a KDE rendering to a billboard last week, I was then tasked with trying a different approach to how the rendering was done. So, to recap, below was how I was doing it:</p> <ol> <li>Render one point's KDE offscreen to a single billboard, passing its position and sigma to the fragment shader as uniforms.</li> <li>Capture the last rendering's screen as a texture.</li> <li>Render the next point's KDE, and sum it up with the last rendering's texture.</li> <li>Do this until the end of the points.</li> <li>Capture the final render screen as a texture.</li> <li>Apply post-processing effects (colormapping).</li> <li>Render the result to the screen.</li> </ol> <p>This approach was good, but it had some limitations and issues down the line that would probably take more processing time and attention to detail (correct matrix transformations, etc.) than would be ideal. The new idea is pretty similar, but with some differences:</p> <ol> <li>Activate additive blending in OpenGL.</li> <li>Render each point's KDE to its own billboard, with position defined by the point's position, all together in one pass.</li> <li>Capture the rendered screen as a texture.</li> <li>Pass this texture to a billboard.</li> <li>Apply post-processing effects (colormapping).</li> <li>Render the result to the screen.</li> </ol> <p>So I basically needed to do that.</p> <h2>Was it Hard?</h2> <p>Fortunately, it wasn't so hard to do in the end. Following those steps turned out pretty smoothly, and after some days, I had the below result:</p> <p><img alt="" height="808" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/final_2d_plot.png" width="938"></p> <p>This is a 2D KDE render of 1000 random points. For this, I used the <em>"viridis"</em> colormap from <em>matplotlib</em>. 
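</p> <p>The additive composition at the heart of this pipeline can be illustrated on the CPU with NumPy. The sketch below is only an analogy for what the GPU does with additive blending, one Gaussian "billboard" accumulated per point, and is not the project's shader code; the function name and parameters are made up for this example:</p> <pre><code class="language-python">import numpy as np

def accumulate_kde(points, sigmas, size=64):
    """Sum one Gaussian splat per point over a size x size grid covering [0, 1]^2."""
    ys, xs = np.mgrid[0:size, 0:size] / (size - 1)
    image = np.zeros((size, size))
    for (px, py), s in zip(points, sigmas):
        image += np.exp(-((xs - px) ** 2 + (ys - py) ** 2) / (2.0 * s ** 2))
    return image / len(points)  # the 1/n factor of the KDE formula

density = accumulate_kde([(0.5, 0.5), (0.2, 0.8)], [0.1, 0.05])</code></pre> <p>The resulting <em>density</em> array is what a colormapping pass would then turn into a picture, mapping low values to dark colors and high values to bright ones.</p> <p>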
Some details worth noting:</p> <ul> <li>For this to work, I implemented three texture helper functions: <em>window_to_texture()</em>, <em>texture_to_actor()</em> and <em>colormap_to_texture()</em>. The first one captures a window and passes it as a texture to an actor, the second one passes an imported texture to an actor, and the last one passes a colormap, previously passed as an array, as a texture to an actor.</li> <li>The colormap is obtained directly from <em>matplotlib</em>, available in its <em>colormaps</em> object.</li> <li>This was only a flattened 2D plot. At first, I could not figure out how to make the connection between the offscreen interactor and the onscreen one, so rotating and moving around the render was not happening. After some pondering and talking to my mentors, they told me to use <em>callback</em> functions inside the interactor, and after doing that, I managed to make the 3D render work, with the following result:</li> </ul> <p><img alt="" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/3d_kde_gif.gif"></p> <p>After those results, I refactored my PR <a href="https://github.com/fury-gl/fury/pull/804">#804</a> to better fit its current status, and it is now ready for review. Success!</p> <h2>This Week's Goals</h2> <p>After finishing the first iteration of my experimental program, the next step is to work on an API for KDE rendering. I plan to meet with my mentors and talk about the details of this API, so expect an update next week. Also, I plan to take a better look at my fellow GSoC FURY contributors' work so that when their PRs are ready for review, I will be better prepared for it.</p> <p>Let's get to work!</p>joaovdafloriano@usp.br (joaodellagli)Tue, 18 Jul 2023 18:41:23 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-7-experimentation-done/Week 6: Things are Starting to Build Uphttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-6-things-are-starting-to-build-up/<p>Hello everyone, time for another weekly blogpost! 
Today, I will show you my current progress on my project and latest activities.</p> <h2>What I did Last Week</h2> <p>Last week I had the goal of implementing KDE rendering to the screen (if you want to understand what this is, check my <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-5-all-roads-lead-to-rome/">last blogpost</a>). After some days diving into the code, I finally managed to do it:</p> <p><img alt="" height="785" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/buffer_compose.png" width="743"></p> <p>This render may seem clean and working, but the code isn't exactly like that. For this to work, some tricks and workarounds needed to be done, as I will describe in the section below.</p> <p>Also, I reviewed the shader part of Tania's PR <a href="https://github.com/fury-gl/fury/pull/791">#791</a>, which implements ellipsoid actors inside FURY. It was my first review of a PR that isn't a blogpost, so it was an interesting experience and I hope I can get better at it.</p> <p>It is important as well to point out that I had to dedicate myself to finishing my graduation capstone project's presentation, which I will give this week, so I had limited time to polish my code, which I plan to do better this week.</p> <h2>Where the Problem Was</h2> <p>The KDE render basically works by rendering the KDE of a point to a texture and summing that texture into the next render. For this to work, the texture, rendered to a billboard, needs to be the same size as the screen; otherwise, the captured texture will include the black background. The problem I faced with that is that the billboard scaling isn't exactly well defined, so I had to guess, for a fixed screen size (in this example, I worked with <em>600x600</em>), what scaling value made the billboard fit exactly inside the screen (it's <em>3.4</em>). 
That is far from ideal, as I will need to modularize this behavior inside a function that works for every case, so I will need to figure out a way to fix that for every screen size. For that, I have two options:</p> <ol> <li> <p>Find the scaling factor function that makes the billboard fit into any screen size.</p> </li> <li> <p>Figure out how the scaling works inside the billboard actor to understand if it needs to be refactored.</p> </li> </ol> <p>The first seems ok to do, but it is kind of a workaround as well. The second one is a good general solution, but it is a more delicate one, as it deals with how the billboard works, and already-existing applications of it may suffer problems if the scaling is changed. I will discuss with my mentors which is better.</p> <p>Another problem I faced (already fixed) was related to shaders. I didn't fully understand how shaders work inside FURY, so I was using my own fragment shader implementation, completely replacing the already existing one. That was working, but I was having an issue with the texture coordinates of the rendering texture. As I completely replaced the fragment shader, I had to pass custom texture coordinates to it, resulting in distorted textures that ruined the calculations. Those issues motivated me to learn the shaders API, which allowed me to use the right texture coordinates and finally render the results you see above.</p> <h2>This Week's Goals</h2> <p>For this week, I plan to try a different approach Filipi, one of my mentors, told me to try. This approach was supposed to be the original one, but a communication failure led to the path I am currently on. This approach renders each KDE calculation into its own billboard, and those are rendered together with additive blending. 
After this first pass, this render is captured into a texture and then rendered to another big billboard.</p> <p>Also, I plan to refactor my draft PR <a href="https://github.com/fury-gl/fury/pull/804">#804</a> to make it more understandable, as its description still dates back to the time I was using the flawed Framebuffer implementation. My fellow GSoC contributors will eventually review it, and to do so, they will need to understand it.</p> <p>Wish me luck!</p> <p> </p>joaovdafloriano@usp.br (joaodellagli)Wed, 12 Jul 2023 14:06:06 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-6-things-are-starting-to-build-up/Week 5: All Roads Lead to Romehttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-5-all-roads-lead-to-rome/<p>Hello everyone, time for another weekly blogpost! Today, we will talk about taking different paths to reach your objective.</p> <h2>Last Week's Effort</h2> <p>After having the FBO properly set up, the plan was to finally <em>render</em> something to it. Well, I wished for a less bumpy road in my <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-4-nothing-is-ever-lost/">last blogpost</a>, but as things in this project apparently tend to go wrong, of course the same happened with this step.</p> <h2>Where the Problem Was</h2> <p>Days passed without anything being rendered to the FBO. The setup I was working on followed the simplest OpenGL pipeline of rendering to an FBO:</p> <ol> <li>Set up the FBO.</li> <li>Attach a texture to its color attachment.</li> <li>Set up the shader to be used in the FBO render and the shader to render the FBO's color attachment.</li> <li>Render to the FBO.</li> <li>Use the color attachment as a texture attached to a billboard to render what was on the screen.</li> </ol> <p>But it seems like this pipeline doesn't translate well into VTK. I paired again on Wednesday with my mentors, Bruno and Filipi, to try to figure out where the problem was, but after hours we could not find it. 
Wednesday passed and then Thursday came, and with Thursday, a solution: Bruno didn't give up on the idea and dug deep into VTK's documentation until he found a workaround to do what we wanted: retrieving a texture from what was rendered to the screen and passing it as a texture to render to the billboard. To do it, he figured out we needed to use a different class, <a href="https://vtk.org/doc/nightly/html/classvtkWindowToImageFilter.html">vtkWindowToImageFilter</a>, a class that has the specific job of doing exactly what I described above. Below, the steps to do it:</p> <pre><code class="language-python">windowToImageFilter = vtk.vtkWindowToImageFilter()
windowToImageFilter.SetInput(scene.GetRenderWindow())
windowToImageFilter.Update()

texture = vtk.vtkTexture()
texture.SetInputConnection(windowToImageFilter.GetOutputPort())

# Bind the framebuffer texture to the desired actor
actor.SetTexture(texture)</code></pre> <p>This is enough to bind to the desired actor a texture that corresponds to what was previously rendered to the screen.</p> <h2>This Week's Goals</h2> <p>Having a solution to that, now it's time to finally render some KDEs! This week's plans involve implementing the first version of a KDE calculation. For anyone interested in understanding what a Kernel Density Estimation is, here is a brief summary from this <a href="https://en.wikipedia.org/wiki/Kernel_density_estimation">Wikipedia page</a>:</p> <p>"In statistics, kernel density estimation (KDE) is the application of kernel smoothing for probability density estimation, i.e., a non-parametric method to estimate the probability density function of a random variable based on kernels as weights. KDE answers a fundamental data smoothing problem where inferences about the population are made, based on a finite data sample. 
In some fields such as signal processing and econometrics it is also termed the Parzen–Rosenblatt window method, after Emanuel Parzen and Murray Rosenblatt, who are usually credited with independently creating it in its current form. One of the famous applications of kernel density estimation is in estimating the class-conditional marginal densities of data when using a naive Bayes classifier, which can improve its prediction accuracy."</p> <p>This complicated sentence can be translated into the image below:</p> <p><img alt="" height="345" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/KDE_plot.png" width="560"></p> <p>That is what a KDE plot of 100 random points looks like. The greener the area, the greater the density of points. The plan is to implement something like that with the tools we now have available.</p> <p>Let's get to work!</p> <p> </p>joaovdafloriano@usp.br (joaodellagli)Tue, 04 Jul 2023 21:00:28 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-5-all-roads-lead-to-rome/Week 4: Nothing is Ever Losthttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-4-nothing-is-ever-lost/<p> </p> <p>Welcome again to another weekly blogpost! Today, let's talk about the importance of guidance throughout a project.</p> <h2>Last Week's Effort</h2> <p>So, last week my project was struggling with some issues that were supposedly simple in concept, yet intricate in execution. If you recall from my <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-3-watch-your-expectations/">last blogpost</a>, I could not manage to make the Framebuffer Object setup work, as its method, <em>SetContext()</em>, wasn't able to generate the FBO inside OpenGL. 
Well, after some (more) research on that, while also diving into my plan B, which involved studying numba as a way to accelerate a data structure I implemented in my PR <a href="https://github.com/fury-gl/fury/pull/783">#783</a>, one of my mentors and I decided we needed a pair programming session, which finally happened on Thursday. After that session, we could finally understand what was going on.</p> <h2>Where the Problem Was</h2> <p>Apparently, for the FBO generation to work, it is first necessary to initialize the context interactor:</p> <pre><code class="language-python">FBO = vtk.vtkOpenGLFramebufferObject()

manager.window.SetOffScreenRendering(True)  # so the window doesn't show up, but important for later as well

manager.initialize()  # missing part that made everything work

FBO.SetContext(manager.window)  # sets the context for the FBO; finally, it works

FBO.PopulateFramebuffer(width, height, True, 1, vtk.VTK_UNSIGNED_CHAR, False, 24, 0)  # and now I could populate the FBO with textures</code></pre> <p>This simple missing line of code was responsible for ending weeks of suffering, as after that, I called:</p> <pre><code class="language-python">print("FBO of index:", FBO.GetFBOIndex())
print("Number of color attachments:", FBO.GetNumberOfColorAttachments())</code></pre> <p>That outputted:</p> <pre><code class="language-python">FBO of index: 4
Number of color attachments: 1</code></pre> <p>That means the FBO generation was successful! 
One explanation that seems reasonable to me for why that was happening is that, as the window was not initialized, a <em>null</em> context was being passed to the <em>SetContext()</em> method, which returned without any warning of what was happening.</p> <p>Here, I would like to point out how my mentor was <strong>essential</strong> for this solution to come about: I had struggled with that for some time and could not find a way out, but a single session of synchronous pair programming, where I could clearly expose my problem and talk to someone far more experienced than I, someone designated for that, was my way out of this torment. So value your mentors! Thanks Bruno!</p> <h2>This Week's Goals</h2> <p>Now, with the FBO working, I plan to finally <strong>render</strong> something to it. For this week, I plan to come back to my original plan and experiment with simple shaders, just as a proof of concept that the FBO will be really useful for this project. I hope the road is less bumpy from now on and I don't stumble upon any other complicated problem.</p> <p>Wish me luck!</p>joaovdafloriano@usp.br (joaodellagli)Mon, 26 Jun 2023 21:17:07 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-4-nothing-is-ever-lost/Week 3: Watch Your Expectationshttps://blogs.python-gsoc.org/en/joaodellaglis-blog/week-3-watch-your-expectations/<p>Hello everyone, it's time for another weekly blogpost! This week, I will talk about how you should watch your expectations when working on any project.</p> <h2>This Last Week's Effort</h2> <p>As I had supposedly managed to get the texture allocation part working, this last week's goal was to render something to an FBO. Well, I could make textures work, but what I wasn't expecting, and later realised, was that the FBO setup wasn't working. Below, I will describe where I got stuck.</p> <h2>Where the Problem Was</h2> <p>After getting the texture setup right, I was ready to render some color to the FBO. 
Well, I <strong>was</strong>, because I didn't expect I would have another problem, this time with the FBO setup. As described in my <a href="https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-fbo-saga-week-1/">week 1 blogpost</a>, an FBO has some requirements to work. My current problem lies in the FBO method <em>FBO.SetContext()</em>, which for some reason is not able to generate the FBO. Below, how the method currently operates:</p> <p><img alt="" height="711" src="https://raw.githubusercontent.com/JoaoDell/gsoc_assets/main/images/setcontext.png" width="682"></p> <p>Apparently, the method gets stuck before <em>this-&gt;CreateFBO()</em>, which can be checked by calling <em>FBO.GetFBOIndex()</em>: it returns a <em>0</em> value, meaning the FBO was not generated by the <em>glGenFramebuffers()</em> function inside the <em>CreateFBO()</em> method.</p> <h2>This Week's Goals</h2> <p>As I got stuck again on this simple step, talking with my mentors, we concluded that a plan B was needed for my GSoC participation, as my current project was not making much progress. This plan B, which I am going to start working on this week, involves working on <a href="https://github.com/fury-gl/furyspeed">FURY Speed</a>, a FURY addon that aims to develop optimized functions and algorithms to help speed up graphical applications. The suggestion was to work on a PR I submitted months ago, <a href="https://github.com/fury-gl/fury/pull/783">#783</a>, so as to integrate it into FURY Speed. 
Also, I plan to keep working on my current project to find the solution I will need to make the FBO usage work.</p> <p>Let's get to work!</p>joaovdafloriano@usp.br (joaodellagli)Wed, 21 Jun 2023 17:39:41 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/week-3-watch-your-expectations/The Importance of (good) Documentation - Week 2https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-importance-of-good-documentation-week-2/<p>Hello everybody, welcome to week 2 of this project! I must admit I thought this would be simpler than it is currently proving to be, but I forgot that when it comes to dealing with computer graphics applications, things never are. Below, some updates on what I have been up to over this past week. </p> <h2>This Last Week's Effort</h2> <p>Last week, I was facing some issues with a VTK feature essential for moving forward with my project: Framebuffer Objects. As described in my <a href="https://blogs.pythongsoc.org/en/joaodellaglis-blog/the-fbo-saga-week-1/">last blogpost</a>, for some reason the 2D allocation methods for it weren't working. 
In a meeting with my mentors, while we were searching through VTK's FramebufferObject and TextureObject documentation, and through the code itself, for the problem, one TextureObject method caught my attention: <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html#a0988fa2a30b640c93392c2188030537e">vtkTextureObject.SetContext()</a>.</p> <h2>Where the Problems Were</h2> <p>My last week's code was:</p> <pre><code class="language-python">color_texture = vtk.vtkTextureObject()  # color texture declaration
color_texture.Bind()  # binding the texture for operations
color_texture.SetDataType(vtk.VTK_UNSIGNED_CHAR)  # setting the datatype as unsigned char
color_texture.SetInternalFormat(vtk.VTK_RGBA)  # setting the format as RGBA
color_texture.SetFormat(vtk.VTK_RGBA)
color_texture.SetMinificationFilter(0)  # setting the minification filter
color_texture.SetMagnificationFilter(0)  # setting the magnification filter
color_texture.Allocate2D(width, height, 4, vtk.VTK_UNSIGNED_CHAR)  # here is where the code stops</code></pre> <p>But it turns out that to allocate the FBO's textures, which are of type vtkTextureObject, you also need to set the context where the texture object will be present, so the code lacked the line:</p> <pre><code class="language-python">color_texture.SetContext(manager.window)  # set the context where the texture object will be present</code></pre> <p>And with that line added after Bind():</p> <pre><code class="language-python">color_texture = vtk.vtkTextureObject()
color_texture.Bind()
color_texture.SetContext(manager.window)  # set the context where the texture object will be present
color_texture.SetDataType(vtk.VTK_UNSIGNED_CHAR)
color_texture.SetInternalFormat(vtk.VTK_RGB)
color_texture.SetFormat(vtk.VTK_RGB)
color_texture.SetMinificationFilter(0)
color_texture.SetMagnificationFilter(0)</code></pre> <p>The code worked fine. But as my last blogpost showed, the Allocate3D() method worked just fine without any (visible) problem. Why is that? 
Well, in fact, it <strong>didn't work</strong>. If we check the code for Allocate2D and Allocate3D, one difference can be spotted.</p> <p>While Allocate2D contains an "<em>assert(this-&gt;Context);</em>", in Allocate3D the assertion is translated into:</p> <pre><code class="language-cpp">if (this-&gt;Context == nullptr)
{
  vtkErrorMacro("No context specified. Cannot create texture.");
  return false;
}</code></pre> <p>This slight difference is significant: while Allocate2D makes the program fail immediately, Allocate3D simply returns <strong>false</strong>, with its error pushed to vtkErrorMacro. I could have realised that earlier had I been watching vtkErrorMacro's output, but this contrasting implementation made it harder for me and my mentors to realise what was happening.</p> <h2>This Week's Goals</h2> <p>After getting that to work, this week's goal is to render something to the Framebuffer Object now that it is working. To do that, I will first need to do some offscreen rendering to it, and afterwards render what was drawn to its color attachment, the Texture Object I was struggling to make work, onto the screen by drawing its texture on a billboard. Also, I plan to start using vtkErrorMacro, as it seems to be the main error interface when working with VTK, which may make my life easier.</p> <p>See you next week!</p>joaovdafloriano@usp.br (joaodellagli)Tue, 13 Jun 2023 13:40:46 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-importance-of-good-documentation-week-2/The FBO Saga - Week 1https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-fbo-saga-week-1/<h2 style="text-align: justify;">This Past Week</h2> <p style="text-align: justify;">As mentioned in last week's blogpost, the goal for that week was to investigate VTK's Framebuffer Object framework. 
An update on that: indeed, VTK has a working low-level <a href="https://vtk.org/doc/nightly/html/classvtkOpenGLFramebufferObject.html">FBO class</a> that can be used inside FURY; however, it comes with some issues that I will explain further below.</p> <p style="text-align: justify;"> </p> <h2 style="text-align: justify;">My Current Problems</h2> <p style="text-align: justify;">The problems I am having with this FBO implementation are related, first, to how an FBO works and, second, to how VTK works. In OpenGL, a custom user FBO needs a few things to be complete (usable):</p> <ol> <li style="text-align: justify;">At least one buffer should be attached. This buffer can be the color, depth or stencil buffer.</li> <li style="text-align: justify;">If no color buffer will be attached, then OpenGL needs to be warned that no draw or read operations will be done on that buffer. Otherwise, there should be at least one color attachment.</li> <li style="text-align: justify;">All attachments should have their memory allocated.</li> <li style="text-align: justify;">Each buffer should have the same number of samples.</li> </ol> <p style="text-align: justify;">My first problem lies in the third requirement. VTK's FBO implementation requires a <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html">vtkTextureObject</a> as a texture attachment. I figured out how to work with this class; however, I cannot allocate memory for it, as the methods for that, <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html#abc91bbf9a3414bded7a132d366ca4951">Allocate2D</a>, <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html#a7e9dd67f377b7f91abd9df71e75a5f67">Create2D</a> and <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html#a0e56fe426cb0e6749cc6f2f8dbf53ed7">Create2DFromRaw</a>, do not seem to work. Every time I try to use them, my program stops with no error message at all. 
For anyone interested in what exactly is happening, below is how my tests are implemented:</p> <p style="text-align: justify;"> </p> <pre><code class="language-python">color_texture = vtk.vtkTextureObject()  # color texture declaration
color_texture.Bind()  # binding the texture for operations
color_texture.SetDataType(vtk.VTK_UNSIGNED_CHAR)  # setting the datatype as unsigned char
color_texture.SetInternalFormat(vtk.VTK_RGBA)  # setting the format as RGBA
color_texture.SetFormat(vtk.VTK_RGBA)
color_texture.SetMinificationFilter(0)  # setting the minification filter
color_texture.SetMagnificationFilter(0)  # setting the magnification filter
color_texture.Allocate2D(width, height, 4, vtk.VTK_UNSIGNED_CHAR)  # here is where the code stops</code></pre> <p style="text-align: justify;"> </p> <p style="text-align: justify;">In contrast, for some reason, the method for 3D textures, <a href="https://vtk.org/doc/nightly/html/classvtkTextureObject.html#aaeefa46bd3a24bf62126512a276819d0">Allocate3D</a>, works just fine. I could use it as a workaround, allocating memory for a width x height x 1 texture (equivalent to a 2D texture), but I do not wish to, as that just does not make sense.</p> <p style="text-align: justify;">My second problem lies in VTK itself. As VTK is a library that encapsulates some OpenGL functions in more palatable forms, it comes with some costs. Working with FBOs is lower-level work that requires strict control of some OpenGL states and specific functions, work that would be simpler if OpenGL were the main API here. 
However, some of these states and functions are spread implicitly throughout VTK's complex classes and methods, which doubles the time spent on otherwise simple instructions, as I first need to dig through lines and lines of VTK's documentation and, worse, the code itself.</p> <p style="text-align: justify;"> </p> <h2 style="text-align: justify;">What About Next Week?</h2> <p style="text-align: justify;">For this next week, I plan to investigate further why the first problem is happening. If that is accomplished, things will be much simpler for my project, as I will finally be able to implement the more pythonic functions I need to render some kernel distributions onto my screen.</p> <p style="text-align: justify;">Wish me luck!</p> <p> </p> <p> </p> <p> </p>joaovdafloriano@usp.br (joaodellagli)Mon, 05 Jun 2023 19:54:45 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-fbo-saga-week-1/The Beggining of Everything - Week 0https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-beggining-of-everything-week-0/<h2><strong><em>So It Begins...</em></strong></h2> <p>Hello everyone, welcome to the beginning of my journey through GSoC 2023! I would like to thank everyone involved for the opportunity provided; it is an honour to work side by side with so many experienced professionals from around the world.</p> <h2><strong>The Community Bonding Period</strong></h2> <p>During the community bonding period, I had the opportunity to meet my mentors and some people from the FURY team. It was a great time to learn about the community guidelines and everything I will need to work with them during this summer.</p> <p> </p> <h2><strong>The Project's Goal</strong></h2> <p>For some reason I am having trouble uploading my proposal, so I will introduce my goals here. 
Briefly explaining this project: I plan to implement a real-time Kernel Density Estimation shader inside the FURY library, based on <a href="https://github.com/filipinascimento/PACSExplorer/blob/782e52334a635528ec3ab4c7a4409cc88958d3ba/lib/density-gl.js">Filipi Nascimento's WebGL implementation</a>. Kernel Density Estimation, or KDE, is a visualization technique that provides a good macro view of large and complex data sets, such as point clouds, summarizing their spatial distribution as smooth areas. I really think FURY, as a scientific library, will benefit from this, given that it is a computer graphics library that originated in 2018, built on top of the Visualization Toolkit (VTK), and has been improving ever since.</p> <h2><strong>This Week's Goal</strong></h2> <p>For all of this to work, the project needs one component working: the KDE framebuffer. As this <a href="https://www.khronos.org/opengl/wiki/Framebuffer">Khronos wiki page</a> explains:</p> <p>"A <b>Framebuffer</b> is a collection of buffers that can be used as the destination for rendering. OpenGL has two kinds of framebuffers: the <a href="https://www.khronos.org/opengl/wiki/Default_Framebuffer" title="Default Framebuffer">Default Framebuffer</a>, which is provided by the <a href="https://www.khronos.org/opengl/wiki/OpenGL_Context" title="OpenGL Context">OpenGL Context</a>; and user-created framebuffers called <a href="https://www.khronos.org/opengl/wiki/Framebuffer_Object" title="Framebuffer Object">Framebuffer Objects</a> (FBOs). The buffers for default framebuffers are part of the context and usually represent a window or display device. The buffers for FBOs reference images from either <a href="https://www.khronos.org/opengl/wiki/Texture" title="Texture">Textures</a> or <a href="https://www.khronos.org/opengl/wiki/Renderbuffer" title="Renderbuffer">Renderbuffers</a>; they are never directly visible."</p> <p>In other words, a framebuffer is an object that stores data related to a frame. 
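</p> <p>To make the kind of computation the KDE shader will eventually perform a bit more concrete, here is a minimal, plain-Python sketch of a 2D Gaussian KDE evaluated at a single point. The Gaussian kernel and the normalization below are my own illustration, not necessarily the exact formulation the shader will end up using:</p> <pre><code class="language-python">import math

def gaussian_kde(points, query, bandwidth):
    """Evaluate a 2D Gaussian KDE at `query`, summing one kernel per data point."""
    total = 0.0
    for (px, py) in points:
        # squared distance from the query position to this data point
        d2 = (query[0] - px) ** 2 + (query[1] - py) ** 2
        total += math.exp(-d2 / (2.0 * bandwidth ** 2))
    # normalize so the mixture of kernels integrates to 1 over the plane
    return total / (len(points) * 2.0 * math.pi * bandwidth ** 2)

points = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.5)]
print(gaussian_kde(points, (0.5, 0.25), 0.5))</code></pre> <p>The GPU version would evaluate something like this per fragment, accumulating the kernel contributions into a framebuffer instead of looping on the CPU.</p> <p>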
So the goal for this week is to investigate whether VTK, the API on which FURY is built, has a framebuffer object interface, and if it does, to understand how it works and how to use it for the project. <br> <br> Let's get to work!</p>joaovdafloriano@usp.br (joaodellagli)Mon, 29 May 2023 16:51:59 +0000https://blogs.python-gsoc.org/en/joaodellaglis-blog/the-beggining-of-everything-week-0/