Forums > The Library > Worlds.com
Oct 19th, 2022 at 4:25 am (edited)
#1147
I read some of the FAQ. WorldsPlayer only seems to support and detect a handful of older graphics cards, and defaults to the CPU for rendering otherwise. This is fine I suppose since computers are pretty fast now, but it makes Worlds an absolute CPU hog on its single core, and causes FPS drops in busy rooms with custom models where there ought to be no performance dips at all.

Renderware 2 predates consumer OpenGL support or anything fancy like that. Well, okay, what does Worlds support?

Unless you feel like buying an old Voodoo 3 and plugging it into your computer (you probably can do this; PC architecture has not really changed in some 40 years), there don't seem to be a lot of options. Even then, using an older card has its issues. You need drivers for your operating system, for one. You need Worlds to somehow detect it on your new OS with new APIs and multi-GPU support, which is confusing to a program that is 30 years old. And even then, a GPU that old isn't going to perform much better than software rendering when people with 400,000-polygon custom avatars are sitting in GroundZero. Not everyone feels like setting up an XP KVM either.

The real question is, what would it take to get modern 3D acceleration working with WorldsPlayer? Probably some kind of modding. I'm sure that our lack of understanding of the CMP format isn't really helping with that.

Could we just get a KVM and emulate a fake 2000s-era Glide card with the host GPU as an accelerator?
Oct 19th, 2022 at 10:42 pm
#1152
I'm pretty sure Worlds uses GDI for its GPU-accelerated rendering. I could be wrong and it uses something else, but nothing I've found suggests an alternative.

As for actually getting it working, 1900+ versions won't work at all in any tests I've previously tried. 1900 might display a 3D support message if you get it to "detect" but it'll never properly render. The rendering detection is done in the native code, and I'm guessing it gets passed to Renderware from there.

Here is a nifty option: disableOpSysCheck=1
Seems to disable the actual checking of a supported hardware device. I haven't done too much investigating into it though.
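For anyone else trying it, I just appended the line to my Worlds INI file as-is (the exact file and section may vary by version, so double-check your own install):

```ini
; Reportedly skips the supported hardware/OS check on startup.
; Placement here is from my setup, not official documentation.
disableOpSysCheck=1
```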

I have no clue about faking GPU support in KVM, but if we have to resort to it, it is an option to grab a Voodoo card. Seems this guy figured it out using Windows 98: https://www.vogons.org/viewtopic.php?f=9&t=72401
Oct 19th, 2022 at 10:53 pm (edited)
#1153
I have no clue about faking GPU support in KVM, but if we have to resort to it, it is an option to grab a Voodoo card. Seems this guy figured it out using Windows 98: https://www.vogons.org/viewtopic.php?f=9&t=72401
I believe there's a Glide library somewhere in Worlds but I don't remember where I saw it. Maybe if I wrap that with dgVoodoo it will work, but this isn't a traditional game engine setup by any means so I don't really expect it to work right.

Basically I want my 1060 to somehow run the graphics of WorldsPlayer directly. I don't think a physical GPU officially supported by Worlds will give great performance, since many custom worlds and avatars made recently aren't made with old computers in mind. Seeing someone with an Ahnka avatar that has 2 million tris on it is inevitable, and it won't play nice with hardware that old.

The idea I had in mind is that we could create a fake Voodoo driver for a VM that passes calls onto the host GPU. Worlds will think it's a Voodoo but it will actually be a newer GeForce card. Brings the support I mentioned and eliminates the performance limitations. That's a huge ask, so I'm wondering if something like that exists already.
Feb 12th, 2023 at 6:29 am (edited)
#1468
This isn't quite what I had in mind, but I wanted to see what it would take to make Worlds actually detect a 3D accelerator.

I tried an 86box virtual machine, and tested with multiple graphics configurations supported by Worlds according to the FAQ on their site. I also used one of the Ultimate 3D Chat CDs to install from instead of a newer version like 1904.

After spawning in at Meteor and hearing that soundtrack kick in, I was disappointed to find that not only was my very expensive gaming PC struggling to emulate a measly 300 MHz Pentium II, but Worlds had no idea I even had the graphics card installed. I had wasted time beforehand fiddling with and fixing drivers to get the card working under Windows 98 SE, but no dice.

From what I can tell, GPU support in Worlds hardly works at all. I doubt I'll get anywhere without modding, let alone by following the intended setup.
I should try that INI option next time around and see if that does anything.
Feb 12th, 2023 at 6:55 am
#1471
I forgot this thread existed.

I got it somewhat working (quite messily) in Linux WINE not too long ago, but only in an older Worlds version. I can't run any Worlds requiring Java 1.3, so I had to use 1830, since 1900+ doesn't have (working) hardware acceleration. I was able to use dgVoodoo2 with its ddraw.dll, override it to "Builtin, then Native", and then add "Native" overrides for the d3dx.dll files. Then you just run with DXVK to get Vulkan support. The result was... not ideal. If you have epilepsy, definitely don't try it. It seems the way the Worlds renderer signals back to the rendering process that a frame is finished, and what it has actually drawn, doesn't cooperate with what DXVK under WINE expects, so every frame redraws absolutely everything out of sync and a lot of things flash around. Here are some things I noticed though:

- Portals render differently on HW Accel, made clear by the fact that they refused to render at all in this broken state.
- Worlds forces anti-aliasing on HW Accel for every texture (this is not a result of the graphics libraries used; I checked).
- There is absolutely zero occlusion culling in Worlds. Whatever is in your view gets rendered, even past walls and boundaries.
- Transparency looks beautiful when it actually works.

I definitely want to work on this more at some later point, potentially get whatever Worlds desires to work correctly so we can have a seamless experience. Let me know if you have a different experience than mine.
Feb 12th, 2023 at 7:43 am (edited)
#1472
On Windows 10 with version 1890, Worlds would detect the graphics card when first starting up (I added disableOpSysCheck=1 to be sure). But once acceleration was enabled, it began crashing any time it tried to render a frame to the window.


Version 1842 did not detect the graphics card and also complained I was low on memory.


The latest build of dgVoodoo was used in both tests. All 3 Glide DLLs were installed to both the root and bin folder of the Worlds install. Not sure if I did that right.

I have only JREs 6 and 8 officially installed on this system (other JDKs as well, but they aren't in the PATH).
I'll be rebooting into EndeavourOS to see if things turn out any different.

EDIT: The previous versions I tested will not run on Linux due to some WINE jank so I've reverted to 1804.
This one didn't even require me to install dgVoodoo or modify the INI, it just saw my NVIDIA GeForce GTX 1060 directly and told me I could use it.
Like 1842, 1804 complained my memory was too low, but when starting it with 3D acceleration on, it did not crash. Instead it also complained that my video memory was too low and that my beefy 1060 literally couldn't handle the default window size. It appears to be running fine, but the viewport is completely black. I have no idea if the renderer works properly or not, because something is preventing WorldsPlayer from seeing my full 16GB of RAM and 3GB of VRAM. Large amounts of memory seem to be a common issue for old software. No extra WINE DLL settings were used.
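On the memory complaints: one plausible culprit (unverified, I haven't looked at what WorldsPlayer actually does) is old software storing byte counts in a signed 32-bit integer, which wraps around past 2 GiB. A quick Python illustration of the wrap, nothing Worlds-specific:

```python
import ctypes

# 16 GiB of RAM and 3 GiB of VRAM expressed in bytes, then squeezed
# through a signed 32-bit integer the way a 90s program might store them:
ram = ctypes.c_int32(16 * 1024**3).value
vram = ctypes.c_int32(3 * 1024**3).value
print(ram)   # 0 -- 16 GiB is an exact multiple of 2^32, so it wraps to zero
print(vram)  # -1073741824 -- negative, i.e. read back as "too low"
```

If that's what's happening, it would explain why both the RAM and VRAM checks fail despite plenty of both being available.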

On both Windows and Linux, messing with Direct3D did not do anything for me. Replacing DirectDraw on Windows completely broke the user interface and made it all black.
Feb 23rd, 2023 at 6:31 pm
#1496
I tried using 86Box, and was able to get 1890 to be hardware accelerated.

However, it was performing badly without it, and it felt like the acceleration was just bringing Worlds back to non-terrible framerates.
Feb 26th, 2023 at 9:23 pm
#1521
I tried using 86Box, and was able to get 1890 to be hardware accelerated.

However, it was performing badly without it, and it felt like the acceleration was just bringing Worlds back to non-terrible framerates.
Ultimately my goal here is to get the host's graphics card to do all of the calculations. If we use 86Box, we're only going to get the performance of older cards at best, and probably worse than that due to CPU bottlenecks caused by the emulation. I don't really expect 86Box to run Worlds any better than the host's software rendering would.
Mar 1st, 2023 at 1:35 am
#1527
I recall finding a thread on VOGONS where somebody had made their own emulation of the Voodoo 3(?) card to work with QEMU. It used the host GPU, so it wasn't as slow and CPU-reliant as 86Box and PCem are.

Unfortunately, I can't seem to find the thread anymore. Looked in my history and searched around.
Aug 18th, 2023 at 8:14 pm (edited)
#1920
Whoever Wurscht1 is, thank you.
I'll be testing this when I get home from dinner
https://github.com/Wurscht1/WorldsPlayer-Forced-3D-Accelerator

EDIT: It's not entirely clear if the mod works or not. I did notice a few differences:
- Some textures seem blurrier at angles
- Some rects are brighter than they are supposed to be
- Z-ordering behaves differently and z-fighting is significantly more aggressive than on the software renderer, depending on the world.
- Polygons on rects jitter around a lot more than they do on software
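The z-fighting point would make sense to me if the hardware path uses a quantized depth buffer while the software renderer resolves depth some other way. A toy Python sketch of why quantization causes it (all numbers invented, nothing here is from Worlds itself):

```python
def quantize_depth(z, near=0.1, far=1000.0, bits=16):
    # Standard perspective depth mapping: z=near lands at -1, z=far at +1,
    # then the [0,1]-remapped value is quantized to an integer depth bucket.
    z_ndc = (far + near) / (far - near) - (2 * far * near) / ((far - near) * z)
    return round((z_ndc + 1) / 2 * (2**bits - 1))

# Two surfaces 0.05 units apart, 500 units from the camera:
print(quantize_depth(500.0) == quantize_depth(500.05))  # True -> same bucket
```

Two nearly-coplanar polygons that land in the same bucket have no stable winner per pixel, so they flicker. Depth precision also collapses far from the near plane, which could be why it varies by world.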

GPU usage does increase with the mod, but Task Manager credits all of it to dwm.exe. CPU usage does not decrease. Worlds runs poorly at high resolutions and in fact seems to run slightly worse.

Software vs. Hardware side-by-side comparison:



I personally wouldn't recommend it.
Nov 19th, 2023 at 3:45 am
#2045
I recently discovered softgpu and tried it out. Turns out, WorldsPlayer recognizes the driver as compatible. It runs badly right now (Direct3D is a WIP), but it is very clearly hardware accelerated.

Here is a screenshot with WorldsPlayer 1890 on Kaito. Not the most amazing demonstration, I know. Note how the transparent rects aren't using that grid-like pattern, and note the Java console.

Whoever Wurscht1 is, thank you.
I'll be testing this when I get home from dinner
https://github.com/Wurscht1/WorldsPlayer-Forced-3D-Accelerator
AFAIK there is more to actual hardware acceleration than values in WorldsRenderer.dll. I'm pretty certain the Renderware libraries also have to play a role, or else it's just a weird mess of hardware techniques without the actual support. The blurriness is supposed to be anti-aliasing, since that is what Worlds does to all textures under HW accel.
Dec 8th, 2023 at 11:37 pm (edited)
#2065
I do find it strange that Worlds doesn't seem to have actual MSAA or supersampling on edges, but does support a sort of anti-aliasing on surfaces, like a 90s precursor to anisotropic filtering. I think it's the first time I've seen a technique like this in use, and I'm curious how it works. Or maybe I'm overthinking it.

Texture mapping in Worlds is strange anyway: lots of texels being scaled up with a stair-step or sawtooth effect, and so on. I don't know the math behind texture mapping well at all, but maybe all the little quirks have something to do with how Renderware 2 uses affine mapping when in software mode.
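For what it's worth, the affine theory would fit the look. Affine mapping interpolates texture coordinates linearly in screen space, while perspective-correct mapping interpolates u/z and 1/z and divides. A toy Python comparison (all numbers invented, just to show the two disagree):

```python
def affine_u(u0, u1, t):
    # Affine: interpolate the texture coordinate directly in screen space.
    return u0 + (u1 - u0) * t

def perspective_u(u0, z0, u1, z1, t):
    # Perspective-correct: interpolate u/z and 1/z linearly, then divide.
    uoz = (u0 / z0) + ((u1 / z1) - (u0 / z0)) * t
    ooz = (1 / z0) + ((1 / z1) - (1 / z0)) * t
    return uoz / ooz

# Edge running from z=1 (near) to z=4 (far), u going 0 to 1.
# Halfway across the screen:
print(affine_u(0.0, 1.0, 0.5))                 # 0.5
print(perspective_u(0.0, 1.0, 1.0, 4.0, 0.5))  # 0.2
```

The affine version samples texels far from where they perspectively belong, which shows up as sliding and warping on surfaces at steep angles; whether that's the cause of the specific stair-stepping here, I can't say.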

Just things on my mind, I know it's not quite an answer.