I once met a person who never drank water, only soft drinks. It’s not the unhealthiness of this that disturbed me, but the fact that they did it without the requisite paperwork.

Unlike those disorganised people, I have a formal waiver. I primarily drink steam and crushed glaciers.

  • 1 Post
  • 155 Comments
Joined 2 years ago
Cake day: June 14th, 2023



  • Ooh, thank you for the link.

    “We can leverage it [ray tracing] for things we haven’t been able to do in the past, which is giving accurate hit detection”

    “So when you fire your weapon, the [hit] detection would be able to tell if you’re hitting a pixel that is leather sitting next to a pixel that is metal”

    “Before ray tracing, we couldn’t distinguish between two pixels very easily, and we would pick one or the other because the materials were too complex. Ray tracing can do this on a per-pixel basis and showcase if you’re hitting metal or even something that’s fur. It makes the game more immersive, and you get that direct feedback as the player.”

    It sounds like they’re assigning materials based on the pixels of a texture map, rather than each mesh in a model being a different material, i.e. you paint materials onto a character rather than selecting chunks of the character and assigning them.

    I suspect this either won’t be noticeable to players at all, or it will be a very minor improvement at best. It’s not something worth losing compatibility with other GPUs for. It will require a different work pipeline for the 3D modellers (they have to paint materials on now rather than assign them per mesh), but that’s neither here nor there; it might be easier for them or it might be hell-awful depending on the tooling.

    This particular sentence upsets me:

    Before ray tracing, we couldn’t distinguish between two pixels very easily

    Uhuh. You’re not selling me on your game company.

    “Before” ray tracing, the technology that has been around for decades. This very material-sensing task could have been done on a CPU or GPU, without players noticing, for around 20 years: interpolate UVs across the colliding triangle and sample a texture.
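    That technique is simple enough to sketch in a few lines. A hypothetical illustration (the function names and the material map are made up, not any engine’s real code): given the triangle a shot hit, compute barycentric weights at the hit point, blend the vertices’ UVs, and look up a material-ID texture.

```python
# Hypothetical sketch of pre-ray-tracing material sensing:
# barycentric UV interpolation plus a material-ID map lookup.

def barycentric(p, a, b, c):
    """Barycentric weights of point p in triangle (a, b, c)."""
    v0 = [b[i] - a[i] for i in range(len(a))]
    v1 = [c[i] - a[i] for i in range(len(a))]
    v2 = [p[i] - a[i] for i in range(len(a))]
    d00 = sum(x * x for x in v0)
    d01 = sum(x * y for x, y in zip(v0, v1))
    d11 = sum(x * x for x in v1)
    d20 = sum(x * y for x, y in zip(v2, v0))
    d21 = sum(x * y for x, y in zip(v2, v1))
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return 1.0 - v - w, v, w

def material_at_hit(hit, tri_pos, tri_uv, material_map):
    """Interpolate UVs at the hit point, then sample the material-ID map."""
    wa, wb, wc = barycentric(hit, *tri_pos)
    u = wa * tri_uv[0][0] + wb * tri_uv[1][0] + wc * tri_uv[2][0]
    v = wa * tri_uv[0][1] + wb * tri_uv[1][1] + wc * tri_uv[2][1]
    h = len(material_map)
    w = len(material_map[0])
    # Nearest-texel lookup, clamped to the map edges.
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return material_map[y][x]
```

    With a tiny map whose left half is “leather” and right half is “metal”, a hit near the right edge of the triangle’s UV space returns “metal”. No RT hardware required.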

    I suspect the “more immersion” and “direct feedback” are veils over the real reasoning:

    During NVIDIA’s big GeForce RTX 50 Series reveal, we learned that id has been working closely with the GeForce team on the game for several years (source)

    With such a strong emphasis on RT and DLSS, it remains to be seen how these games will perform for AMD Radeon users

    No-one sane implements Nvidia-exclusive or AMD-exclusive (or anyone else’s) libraries in their games unless they’re paid to do it. A game dev that cares about its players will make the game run well on all brands and flavours of graphics card.

    At the end of the day this hurts consumers. If games run competitively on all GPU brands, you have more choice and card companies have more incentive to compete. Whatever amount of money Nvidia is paying the game devs to do this must be smaller than what it earns back from consumers buying its products instead of competitors’.


  • really flashy guns and there is a very intricate damage system that runs at least partially on the GPU.

    Short opinion: no, CPUs can do that fine (possibly better), and it’s a tiny corner of game logic.

    Long opinion: intersecting projectile paths with geometry gains nothing from being moved from CPU to GPU unless you’re dealing with a ridiculous number of projectiles every single frame. In most games this is less than 1% of CPU time, and moving it to the GPU will probably reduce overall performance due to the latency costs (…but a lot of modern engines already have awful frame latency, so it might fit right in).

    You would only do this if you’ve been told by higher-ups that you have to, OR if you have a really unusual and new game design (thousands of new projectile paths every frame, i.e. hundreds of thousands of bullets per second). Even detailed multi-layer enemy models with vital components only need a few extra traces; using a GPU to calculate that would make the engine dev’s job harder for no gain.
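    To put numbers on “a tiny corner of game logic”: one projectile-versus-triangle test is just a handful of dot and cross products. A minimal sketch in the Möller–Trumbore style (illustrative only, not any engine’s actual code):

```python
# One hitscan-vs-triangle test: cheap enough that even "a few extra
# traces" per shot for multi-layer enemy models is trivial CPU work.

def ray_hits_triangle(origin, direction, a, b, c, eps=1e-9):
    """Return the distance t along the ray to the triangle, or None on a miss."""
    def sub(p, q):
        return (p[0] - q[0], p[1] - q[1], p[2] - q[2])
    def cross(p, q):
        return (p[1] * q[2] - p[2] * q[1],
                p[2] * q[0] - p[0] * q[2],
                p[0] * q[1] - p[1] * q[0])
    def dot(p, q):
        return p[0] * q[0] + p[1] * q[1] + p[2] * q[2]

    e1, e2 = sub(b, a), sub(c, a)
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:        # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, a)
    u = dot(tvec, pvec) * inv_det
    if u < 0.0 or u > 1.0:    # outside the triangle along edge 1
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return t if t > eps else None
```

    Even at thousands of shots per frame this stays a rounding error in the frame budget, which is why shipping it across the PCIe bus to the GPU mostly just buys you latency.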

    Fun answer: check out CNlohr’s noeuclid. Sadly there’s no Windows build (I tried cross-compiling but ended up in dependency hell), but it still compiles and runs under Linux. Physics are on the GPU and the world geometry is very non-traditional. https://github.com/cnlohr/noeuclid


  • Anything odd with temperatures or power draw, perhaps? nvtop shows both for me (but I run an AMD GPU + non-proprietary drivers); otherwise lm-sensors might be good.

    nvtop seems to show normal usage

    Does neither the GPU nor the CPU utilisation change at the 30 min mark? If one is pegged at 100% then it’s probably hard to work out what is going on. Running a singleplayer game, staring at a wall, with a limited framerate configured might let you run both the CPU and GPU at less than 100%, perhaps making it easier to see if one or the other suddenly changes.


  • The title of PCGamer’s article is misleading: they want a court order to do it. Proof of death is not enough.

    “In general, your GOG account and GOG content is not transferable. However, if you can obtain a copy of a court order that specifically entitles someone to your GOG personal account, the digital content attached to it taking into account the EULAs of specific games within it, and that specifically refers to your GOG username or at least email address used to create such an account, we’d do our best to make it happen. We’re willing to handle such a situation and preserve your GOG library—but currently we can only do it with the help of the justice system.”

    They have to do that anyway. Court orders overrule a company’s policies in most (all?) legal systems.


  • I was amazed that we transitioned from one GPU-heavy bubble (crypto) to another (LLM/AI). Whilst the hype for crypto imploded, the use for the hardware sort of didn’t. I wonder if the next bubble will be the same, or if we’ll get some refreshing variety in our money sinks?

    Microsoft et al are subsidizing GenAI to an insane degree. […] prices shoot up for their customers and serve as a rough awakening to all the websites that integrated a crappy chatbot.

    I’ve run some much simpler chatbots on just my desktop PC, so they will have some fallback (if they really choose to take it). Still, it locks up my entire computer for a few seconds for each reply, so even a few hundred users per second at peak would make for an expensive service.
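    As a rough sketch of that economics (every number below is my assumption, not a measurement): if one reply monopolises a whole machine for a few seconds, the peak fleet size is just arrival rate times service time.

```python
import math

# Back-of-envelope capacity estimate; both constants are assumptions.
SECONDS_PER_REPLY = 3.0        # one reply locks up a machine this long
PEAK_REQUESTS_PER_SEC = 200    # "a few hundred users per second" peak

def machines_needed(rate, service_time):
    """Machines required if each handles one reply at a time (Little's law)."""
    return math.ceil(rate * service_time)

print(machines_needed(PEAK_REQUESTS_PER_SEC, SECONDS_PER_REPLY))  # 600
```

    Six hundred desktop-class machines just to cover peak is the kind of bill that makes the subsidised big-company APIs look cheap by comparison.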

    (Insert joke here about customers not noticing or caring about the difference between website chatbots built on big company services vs smaller ones, because they have exactly the same problems just in different hues.)