
Visual Studio has triggered a breakpoint

Yesterday I wasted a lot of time tracking a bug which turned out to be pretty instructive.

I am working on a tiled deferred renderer, and after adding a bunch of features and before starting new ones, I spent some time cleaning up the development mess.

The DirectX debug device was complaining about some objects I forgot to release, so I made sure that my destructors were all doing their job. But then each time I closed the program, this message showed up:

Visual Studio has triggered a breakpoint

Well, it’s not really self-explanatory. I had set no breakpoint; Visual Studio triggered the breakpoint itself. Clicking “Break” led to a SAFE_RELEASE() in the destructor of one of my singletons. When I tried “Continue”, the program terminated without any other errors or messages.

I first tried commenting out the supposedly faulty line. No more errors, but some DirectX objects were still alive. I thought that maybe the device had to be released last. I tried that, but the message came back, and breaking now stopped at SAFE_RELEASE(m_pd3dDevice). I realized it will always break on the last D3D object released; if I leave an object alive the message doesn’t pop up, but then that object is never destroyed, which is no solution.

Obviously the bug was memory related, so I tried some analysis tools. Each one pointed me in a different direction, far from the real solution. So I fell back on a less subtle debugging method: comment everything out!

Since it was a memory problem, I kept only the creation and destruction of my objects, removing the calls to the update and render functions. Surprisingly it worked: no error, everything was destroyed. I then restored the update functions, still fine, then the render functions, and the breakpoint was back.

I continued to isolate the problem inside the rendering functions, until I had only the following code left:

 

// Set render targets
ID3D11RenderTargetView* RTViews[2] = { ppAlbedo, ppNormal };
engine->GetImmediateContext()->OMSetRenderTargets(2, RTViews, 0);

// Clear render targets
quadRenderer->SetPixelShader(m_pClearPixelShader);
quadRenderer->Render();


I started to think I had messed up my quad rendering or render targets, but it turned out the culprit was the apparently innocent SetPixelShader function. It’s a simple (yet horrible) setter:

void QuadRenderer::SetPixelShader(ID3D11PixelShader* ppPixelShader)
{
    m_pQuadPixelShader = ppPixelShader;
}


This is wrong (I’ll come back to that later), but not harmful per se. The true horror lies in the QuadRenderer destructor:


QuadRenderer::~QuadRenderer(void)
{
    SAFE_RELEASE(m_pQuadVertexLayout);
    SAFE_RELEASE(m_pQuadVertexShader);
    SAFE_RELEASE(m_pQuadVertexBuffer);
    SAFE_RELEASE(m_pQuadIndexBuffer);
    SAFE_RELEASE(m_pQuadPixelShader);
}


The pixel shader is released, but which pixel shader? It wasn’t created by the QuadRenderer, so its real owner had already released it. SAFE_RELEASE checks that the pointer is not null, but here the pointer still points to something, something that had already been released, leading straight to the land of undefined behavior, or worse.
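For reference, here is the usual DXUT-style definition of the macro; the null check is its only “safety”, and a dangling pointer sails right through it (the Example function below is mine, just to illustrate):

#include <d3d11.h>

// DXUT-style definition: the only protection is a null check.
#ifndef SAFE_RELEASE
#define SAFE_RELEASE(p) { if (p) { (p)->Release(); (p) = NULL; } }
#endif

void Example(ID3D11PixelShader* pOwnedElsewhere)
{
    // Suppose the real owner already called Release() and the object is gone.
    // pOwnedElsewhere is now dangling: non-null, but pointing at freed memory.
    SAFE_RELEASE(pOwnedElsewhere); // the null check passes -> Release() on a dead object
}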

I learned several lessons thanks to that bug.

– Destructors are not the funniest part of the code, but they need to be written carefully; you can’t just look at the member variables and delete them all, ownership can be trickier than that.

– Despite its name, the SAFE_RELEASE macro is not that safe (obvious, but easy to forget).

– Poor design can lead to annoying bugs. When I implemented the QuadRenderer class I was thinking: “OK, I’ll need a vertex and a pixel shader, but I want to be able to set whichever pixel shader I want.” This is wrong. I don’t want my QuadRenderer to have a pixel shader, I want it to render with a particular pixel shader. There is no need to store it. This is something to watch for, so that your destructors can stay trivial (see the sketch below).
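Here is a minimal sketch of that fix, with hypothetical names rather than the project’s actual code: the pixel shader becomes a parameter of Render() instead of a member, so there is nothing extra for the destructor to release.

#include <d3d11.h>

// Hypothetical reworked interface: the shader is passed in for the draw,
// never stored, so QuadRenderer does not own it.
class QuadRenderer
{
public:
    void Render(ID3D11DeviceContext* pContext, ID3D11PixelShader* pPixelShader)
    {
        pContext->PSSetShader(pPixelShader, NULL, 0); // used for this draw only
        // ... bind the quad vertex/index buffers and issue the draw call ...
    }

    // ~QuadRenderer() now only releases resources the class itself created.
};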

Well, now everything is clean, and I’ve learned much more than I expected. I can start adding new features!

“Physical” midi user interface for lazy programmers

It’s been a while since my last post, but I’ve been busy: new job, new continent, etc.

This summer I was working at Ubisoft Paris, and for one of my tasks I needed to create a sample program to implement an effect. It takes some time to rewrite the tools you usually have in an engine, like a shader builder, DX entity creators, a camera class, etc., but what I really found time-consuming and annoying was everything UI-related. There were a lot of settings, and since the goal was to explore all the possibilities, almost all of them had to be exposed to the user. It’s such a pain: creating your sliders/buttons/whatever, setting the position, width, height, initializing, drawing and updating them, and so on. I used DXUT’s UI components; I’m sure there are better tools out there, but I have to admit I don’t like UI programming, so I wanted to find a better way for my future projects.

So one day I grabbed my MIDI keyboard, wondering how hard it would be to read its inputs. It turns out it’s super easy: using the RtMidi library, I was able to get my inputs in no time. The following week I bought a small MIDI controller, the Korg NanoKontrol2.

Midi user interface

 

 

Look at that, it’s a physical “G”UI!

This controller has everything I could want: sliders, knobs, and buttons with light feedback. So I started to write a small MIDI input manager for my current project, trying to make it easy to use. There is just a simple Update() function that receives/sends all the MIDI messages, and all I have to do is call MidiInputManager::Instance()->GetMidiValue(NKI_F1) to get the current value of the first fader. And that’s it!
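A minimal sketch of the receiving side, assuming RtMidi; the class name and structure here are simplified, not my manager’s actual code:

#include "RtMidi.h"
#include <map>
#include <vector>

// Sketch of a polling MIDI input manager built on RtMidi.
class MidiInput
{
public:
    MidiInput()
    {
        if (m_in.getPortCount() > 0)
            m_in.openPort(0);               // may throw an RtMidiError on failure
        m_in.ignoreTypes(true, true, true); // skip sysex/time/sense messages
    }

    // Poll all pending messages and remember the last value seen per controller.
    void Update()
    {
        std::vector<unsigned char> msg;
        for (;;)
        {
            m_in.getMessage(&msg);          // leaves msg empty when nothing is pending
            if (msg.empty())
                break;
            // Control Change: status 0xBn, then controller number, then value (0-127).
            if (msg.size() == 3 && (msg[0] & 0xF0) == 0xB0)
                m_values[msg[1]] = msg[2];
        }
    }

    unsigned char GetValue(unsigned char cc) const
    {
        std::map<unsigned char, unsigned char>::const_iterator it = m_values.find(cc);
        return it == m_values.end() ? 0 : it->second;
    }

private:
    RtMidiIn m_in;
    std::map<unsigned char, unsigned char> m_values;
};

If I remember the default NanoKontrol2 mapping correctly, the faders send CC 0–7, so GetValue(0) would read the first fader.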

MIDI values are in the range 0–127, so I had to transform them, which is annoying and error-prone. So I added an initialization function, MidiInputManager::Instance()->SetMinAndMaxValues(NKI_F1, 50, 500), and the results of the GetMidiValue function are then already in the correct range.
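The remapping itself is just a linear interpolation over the 7-bit range; a sketch of what such a helper boils down to (the name is mine):

// Hypothetical helper: remap a raw 7-bit MIDI value into [minValue, maxValue].
float RemapMidiValue(unsigned char raw, float minValue, float maxValue)
{
    return minValue + (raw / 127.0f) * (maxValue - minValue);
}

// RemapMidiValue(127, 50.0f, 500.0f) == 500.0f
// RemapMidiValue(64, 50.0f, 500.0f) is about 276.8f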

 

Of course there are some drawbacks. The faders and knobs are not motorized, meaning that if you have saved a default value, you can’t see it on the fader, and you will lose it as soon as you move the fader.

You have only 128 different values, so you really need to set the correct range to get good precision.

You have only 8 knobs and 8 faders. It’s not really a problem, since you can define multiple configurations using the buttons. For example, each fader has 3 buttons: (S)olo, (M)ute and (R)ecord. I’ve linked them as a group, meaning that only one of the three can be active at a time, so the associated fader can control a red channel while S is on, blue while M is on, and so on (see the sketch below). I’m also thinking of using multiple global configurations if I ever need more buttons: using the “track” buttons, I could press “next”, be in “configuration 2”, and have a different mapping.
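A sketch of that exclusive-group logic, under my own naming (the real manager may differ):

#include <array>

// Hypothetical radio group: pressing one button turns the other two off,
// and the active index selects which channel the fader drives.
struct ButtonGroup
{
    std::array<bool, 3> states{ { true, false, false } }; // S, M, R; S lit by default

    void Press(int index)
    {
        for (int i = 0; i < 3; ++i)
            states[i] = (i == index); // only the pressed button stays lit
    }

    int ActiveIndex() const
    {
        for (int i = 0; i < 3; ++i)
            if (states[i]) return i;
        return 0;
    }
};

// Usage idea: 0 = red channel, 1 = blue, 2 = green, selected by the lit button.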

I’ve also noticed some lag sometimes; I’ll have to take a look at that.

 

I’ve uploaded a first version on GitHub, and I’ll update it as I add features. For now it only supports the NanoKontrol2, but it’s easy to port to other controllers.

I’ve made a quick video of my tiled deferred renderer to show how it can be used:

 

I’m sure I’m not the only one to do that, but I hope it will help or inspire someone!

Udacity Introduction to Parallel Programming CS344 VS 2012 Solution

I’m currently following the great Udacity lessons about parallel programming and CUDA.

I made a Visual Studio 2012 solution to do some quick tests for the assignments, and I thought it could be useful to someone, so I put it on GitHub. I’ll try to update it for each lesson.

You can find it there.

Before you can launch it, you may need to follow a few steps.

First, you need to download OpenCV and unzip it to C:/Program Files/opencv, or change the directory accordingly in the “VC++ Directories” of the project properties. Of course, you also need the CUDA SDK.

You may also need to change the Build Customizations in the Visual Studio solution. Right-click on the project and select Build Customizations. If you don’t see a CUDA configuration file, click on “Find existing” and add the CUDA target files located at “C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\BuildCustomizations”.

Right-click on the .cu file and make sure the Item Type is set to “CUDA C/C++”.

Hope it helps, and let me know if there is any trouble.

Particles – Collision and Flow Maps

A new post about the particles, presenting two features: collisions and the use of flow maps.

First, the collisions, with a short video:

 

 

These particles are what is called stateful, meaning that they keep their previous state and can use it to react to their environment.
This lets me apply some rudimentary physics to them (gravity and attraction, for example) and react to collisions with the edges of the screen. But it also makes it possible to react to a potentially dynamic environment. I use a collision texture: for each particle, I check whether there is any information in that texture at its new position. If there is, there is a collision and the particle reacts accordingly (see the sketch below).
Here I write text into the collision texture, but it could be anything, and it can even be dynamic.
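A CPU-side sketch of that collision test; the real version runs in the physics shader, and the names and the collision-texture encoding here are assumptions:

struct Particle { float px, py, vx, vy; };

// Hypothetical collision texture: a non-zero texel means "solid".
bool IsSolid(const unsigned char* collisionTex, int width, int height, float x, float y)
{
    int tx = (int)x, ty = (int)y;
    if (tx < 0 || ty < 0 || tx >= width || ty >= height)
        return true; // the screen borders collide as well
    return collisionTex[ty * width + tx] != 0;
}

// Check the tentative new position; on a hit, bounce instead of moving.
void Integrate(Particle& p, const unsigned char* collisionTex, int width, int height,
               float dt, float gravity)
{
    p.vy += gravity * dt;
    float nx = p.px + p.vx * dt;
    float ny = p.py + p.vy * dt;
    if (IsSolid(collisionTex, width, height, nx, ny))
    {
        p.vx *= -0.5f; // crude damped bounce; the real response can be anything
        p.vy *= -0.5f;
    }
    else
    {
        p.px = nx;
        p.py = ny;
    }
}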

The next video shows the use of a flow map to steer all the particles at once:

 

 

I first used Flow Map Painter to create the flow map, which is really just a set of vectors.

 

Flow Map Painter

 

Once this map is created, I use it in the physics pass to influence the direction of the particles (see the sketch below). This makes it easy to coordinate the movement of a million particles.
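A sketch of how such a flow map can be applied during the update; the names and the way the vectors are encoded in the texture are my assumptions:

// The flow map stores a 2D direction per texel, encoded in [0,255] per channel
// and remapped to [-1,1]; it nudges each particle's velocity toward that direction.
struct Flow { float x, y; };

Flow SampleFlowMap(const unsigned char* flowTex, int width, int height, float u, float v)
{
    int tx = (int)(u * (width - 1));
    int ty = (int)(v * (height - 1));
    const unsigned char* texel = &flowTex[(ty * width + tx) * 2]; // 2 channels: x, y
    Flow f = { texel[0] / 127.5f - 1.0f, texel[1] / 127.5f - 1.0f };
    return f;
}

void ApplyFlow(float& vx, float& vy, const Flow& f, float strength, float dt)
{
    vx += f.x * strength * dt; // steer toward the painted direction
    vy += f.y * strength * dt;
}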

C++/DirectX 11 – 48h Deferred Rendering Engine


 

During the Global Game Jam I coded the beginnings of a deferred engine in C++/DX11 (so it was more of a global “tech” jam) to get familiar with this API. Only directional lights are working right now.
There are still many things to implement and clean up, but I’m still working on it, and I plan to turn it into a voxel engine.

Source code is available here.


GPU Particles


A first video to show and explain the basic workings of my particle engine.

All the update, physics, and collision computations run on the GPU, which gives good performance with a large number of particles (here 1,000,000 particles, locked at 30 fps for the recording).

All the dynamic particle information (X and Y position in the RG channels, X and Y velocity in the BA channels) is stored in a texture (here 1024×1024). Each particle is represented by a set of three vertices. Instead of a position, each vertex stores a texture coordinate, which is used to look up that particle’s data in the texture.

The update happens in two steps. First comes the physics phase: drawing a fullscreen quad, for each pixel of the data texture we read the previous frame’s information and derive the current frame’s from it, according to gravity, collisions, external forces, and so on. Then comes the display phase: the vertices representing each particle are sent to the GPU, and in the vertex shader, thanks to Vertex Texture Fetching and the UVs, we recover the real position, which lets us draw a triangle in the right place (see the sketch below).
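A CPU-side reference of those two steps, using the RGBA encoding described above (the real code is a pixel shader plus vertex texture fetch; this is only a simplified sketch):

#include <vector>

// One texel of the 1024x1024 data texture: position in RG, velocity in BA.
struct ParticleTexel { float posX, posY, velX, velY; };

// Phase 1 (physics pass): derive this frame's state from the previous frame's.
// On the GPU this is a fullscreen quad writing into the data texture.
void PhysicsPass(const std::vector<ParticleTexel>& prev,
                 std::vector<ParticleTexel>& next, float gravity, float dt)
{
    for (size_t i = 0; i < prev.size(); ++i)
    {
        ParticleTexel p = prev[i];
        p.velY += gravity * dt; // other forces (attraction, flow maps...) add in here
        p.posX += p.velX * dt;
        p.posY += p.velY * dt;
        next[i] = p;
    }
}

// Phase 2 (display pass): each particle's three vertices carry only a UV into the
// data texture; the vertex shader fetches the real position from it (Vertex
// Texture Fetching) and offsets the triangle's corners from that point.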

In the video you can see the influence of an attraction force controlled by the mouse, and of gravity. The only collisions are with the edges of the screen. The particles’ color can be either fixed or driven by their velocity. There is also a post-process that draws a color based on particle density, giving a “fluid” look.

In the next video I will show collisions with dynamic objects, as well as the use of flow maps to influence the movement of all the particles.

The source code is available on GitHub.

The project setup can be downloaded here.