Memory, Hardware Acceleration and Strand Shaders, Window Size, Quality Setting

MrMetaldong
Content Creator · Joined Jan 24, 2016
It doesn't seem that SDT is able to take advantage of a super duper GPU very effectively, but a few things I've noticed seem to indicate that, to a certain, minimal degree, it's able to.

The first thing is strand shaders. Turning them on or off has no discernible impact on performance for me, no matter how many are on-screen. This leads me to believe that the game has managed to get my GPU to take care of them.

However, it would seem that rendering the game in a larger window poses problems. For some reason, it looks like the game scales its render resolution with window size, and while it would make sense to see a performance drop if render time were adding significantly to frame time, I don't see any reason why the time it takes to actually draw the frame should waver in the slightest if that task, the one normally handed to the GPU, really were being handled by the GPU. The marked decrease in performance upon enlarging the window has convinced me that somehow, an increased resolution is causing CPU tasks to take longer (???). The same can be said for the Quality setting. I'm not entirely sure what this setting does, but the appearance difference reminds me of turning AA on or off. It wouldn't make sense for a simple increase in the graphical quality of an application like SDT to pose a problem for any modern GPU, were it actually being handled by the GPU.
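
Just to put that in perspective (the window dimensions below are guesses for illustration, not SDT's actual stage size), the number of pixels a software renderer has to fill grows fast when you fullscreen with Show All:

```python
# Rough back-of-the-envelope check: if Flash is rasterizing on the CPU, the work
# per frame scales roughly with the number of pixels it has to fill.
# Both window sizes below are assumptions for illustration, not SDT's real stage size.

windowed = (700, 600)      # assumed default window
fullscreen = (1920, 1080)  # assumed desktop resolution with Show All checked

px_windowed = windowed[0] * windowed[1]
px_fullscreen = fullscreen[0] * fullscreen[1]

print(f"windowed:   {px_windowed:>9,} px")    # 420,000 px
print(f"fullscreen: {px_fullscreen:>9,} px")  # 2,073,600 px
print(f"ratio:      {px_fullscreen / px_windowed:.1f}x more pixels to rasterize")
```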

So I opened MSI Afterburner and tested. Nothing changed GPU usage in the slightest except for window size. Playing the game, GPU usage stayed at 3-5 percent. Making the game fullscreen changed this to 8-10 percent. Given the abysmal frame rate that comes with checking Show All and fullscreening the window, I don't think this ~5% increase in GPU usage had anything to do with hardware acceleration offloading work to the GPU the way I generally expect hardware acceleration to; rather, I think that's just what happens when the stuff taking up the screen stops being Task Manager and a web page and starts being a Flash application. The difference in GPU usage between Low quality with strand shaders off and High quality with strand shaders on, in the same size window, was exactly nothing. It's a bit difficult for me to tell whether this means the GPU isn't being told to do anything, or whether it's being told to do maybe one or two things (I still think it's doing strand shaders), but the load of that task is immeasurably small.
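
For anyone who wants to log this instead of eyeballing Afterburner, a quick-and-dirty Python script like this should do the job (it assumes an NVIDIA card and that nvidia-smi is on your PATH; nothing fancier than polling once a second):

```python
# Minimal GPU utilization logger for NVIDIA cards; assumes nvidia-smi is on PATH.
# Polls once a second so you can tab into SDT, toggle settings, and compare afterwards.
import subprocess
import time

def gpu_utilization_percent():
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=utilization.gpu",
        "--format=csv,noheader,nounits",
    ])
    # One line per GPU; take the first card.
    return int(out.decode().strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        print(f"{time.strftime('%H:%M:%S')}  GPU {gpu_utilization_percent()}%")
        time.sleep(1)
```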

After that, I decided to test CPU load. Testing with CoreTemp, I saw one core's load increase from 60-70% at Low quality, 100% view, and strand shaders off, to 80-90% at High quality, Show All in fullscreen, with strand shaders on. The other 7 cores also seemed to be experiencing greater, rapidly fluctuating loads, though I doubt those numbers were relevant because those cores were most likely handling other tasks. I don't believe Flash Player featured multithreading before 11.4-something anyway. It's really annoying how the only version information Flash Player displays is "11", but I have a feeling that since the version included in the loader pack by sby (the version I use) was chosen for performance, it may very well be post-11.4.
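
If you'd rather log the per-core numbers than watch CoreTemp, something like this (using the third-party psutil module) should give the same picture, minus the temperatures:

```python
# Per-core CPU load sampler using the third-party psutil package (pip install psutil).
# cpu_percent(percpu=True) returns one utilization figure per logical core.
import psutil
import time

while True:
    per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks for 1 second
    line = "  ".join(f"c{i}:{load:5.1f}%" for i, load in enumerate(per_core))
    print(f"{time.strftime('%H:%M:%S')}  {line}")
```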

Task Manager's CPU usage number may give a clearer picture. Continuously ejaculating on a character's face with some mods loaded, at 100% view, Low quality, and strand shaders off, I saw 18-23% usage. With quality set to High and strand shaders on, in the same test, I saw 25-30%. Changing to Show All in fullscreen with High quality and strand shaders on, this became 27-33%. I was unable to see any significant change in CPU usage by taking any configuration and changing strand shaders alone. Similarly, I could not observe any change in framerate by toggling strand shaders on or off. Given that some users experience slowdowns with the option enabled and I do not, I am convinced that strand shaders, at least, are being handled by my GPU.
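
Task Manager lumps everything on the system together, so a sketch like this that watches only the process hosting SDT might be cleaner. The process name it matches is a guess; change it to whatever actually runs SDT on your machine (standalone projector, sby's loader exe, a browser plugin container, and so on):

```python
# Per-process CPU usage for whichever process is hosting SDT/Flash.
# The process-name substring below is an assumption; adjust it to match your setup.
import psutil
import time

TARGET = "flash"  # case-insensitive substring to look for in process names

def find_target():
    for proc in psutil.process_iter(["name"]):
        name = (proc.info["name"] or "").lower()
        if TARGET in name:
            return proc
    return None

proc = find_target()
if proc is None:
    raise SystemExit(f"No running process matching '{TARGET}' found")

proc.cpu_percent(None)  # prime the counter; the first call always returns 0.0
while True:
    time.sleep(1)
    # Divide by core count to get a Task-Manager-style "% of total CPU" figure.
    total = proc.cpu_percent(None) / psutil.cpu_count()
    print(f"{time.strftime('%H:%M:%S')}  {proc.info['name']}  {total:.1f}% of total CPU")
```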

The other thing I noticed is that often, the longer SDT is played, the worse performance seems to get, even if strands, mascara, and whatnot are cleared. To test, I monitored the memory being used by Flash Player as reported by Task Manager, leaving auto mode on hard for 15 minutes or so. The game was using 881,000 KB for the majority of this test, and that number stayed constant for just about the entire second half, in contrast to the second test I did, detailed below. There was a flat increase in memory usage of a few tens of megabytes when the window size was increased, but there was predictably no change as the quality setting was changed, and no change as strand shaders were turned off or on. I noticed framerates of 15-17 at the end of the test, with strands on the screen, strand shaders on, quality set to high, and view size at 100%.
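
The same psutil trick works for logging memory over time, which beats staring at Task Manager for half an hour. It records resident memory (roughly what Task Manager shows) to a CSV so the creep is easy to graph later; again, the process name is a guess you'd need to adjust:

```python
# Logs the Flash/SDT process's resident memory every 10 seconds to a CSV so the
# slow creep is easy to spot or graph afterwards. Process matching is the same
# assumption as above: adjust TARGET to whatever actually hosts SDT for you.
import psutil
import time

TARGET = "flash"

proc = next(
    (p for p in psutil.process_iter(["name"])
     if TARGET in (p.info["name"] or "").lower()),
    None,
)
if proc is None:
    raise SystemExit(f"No running process matching '{TARGET}' found")

with open("sdt_memory_log.csv", "w") as log:
    log.write("time,rss_kb\n")
    while proc.is_running():
        rss_kb = proc.memory_info().rss // 1024
        log.write(f"{time.strftime('%H:%M:%S')},{rss_kb}\n")
        log.flush()
        print(f"{time.strftime('%H:%M:%S')}  {rss_kb:,} KB")
        time.sleep(10)
```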

Restarting the game and returning to the same character, I noticed a memory usage of 840,000 KB and a framerate of 23-24. After a minute or so, the memory usage increased to 848,000 KB and the framerate decreased to a steady 20-21. During the first ejaculation, memory had increased to 854,000 KB and the framerate decreased to 16-17. After that the framerate climbed back up to 21-22 as the strands all finished fading, but memory continued to slowly rise. After about 25 minutes of this, memory had climbed considerably to 889,000 KB and continued to see small, megabyte-by-megabyte increases every so often. Framerates in general, with many strands on screen and with few strands on screen, seemed 1-2 frames per second lower. After 5 more minutes or so, the game finally broke 900,000 KB of memory usage and continued to climb, being left on auto mode. By this time framerates mirrored those of the first test: 15-17, with strands on the screen, strand shaders on, quality set to high, and view size at 100%.
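
Putting a rough number on that creep, taking my figures above at face value:

```python
# Rough growth rate from the second test's numbers: ~840,000 KB at the start,
# ~889,000 KB roughly 25 minutes later (figures copied from the post above).
start_kb, end_kb = 840_000, 889_000
minutes = 25

growth_kb = end_kb - start_kb
print(f"total growth: {growth_kb:,} KB (~{growth_kb / 1024:.0f} MB)")
print(f"rate: ~{growth_kb / minutes / 1024:.1f} MB per minute")
```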

In summary:
When you turn hardware acceleration on, SDT only seems to be able to use your graphics card for strand shaders; everything else affects CPU usage. I even turned hardware acceleration off and came all over the girl. It was hard to notice much of a difference, but framerates seemed maybe 1 lower on average. This is so small that I wonder if the game truly uses hardware acceleration for anything at all. The game also consumes more RAM the longer you leave it open, and this number can easily exceed a gigabyte depending on the number of things you load in INIT or while playing. The only game setting that affects RAM usage is the size of the window if you have "Show All" checked under the View tab. Quality and window size both affect CPU usage, although strand shaders don't appear to.

I'd like to test all this again once that new Skylake i7 gets cheaper and DDR4 RAM gets fast enough to warrant getting an LGA 1151 mobo and upgrading this budget CPU and old RAM. The clock speeds are the same, but I'd like to see if the superior architecture of the 6th-generation Intels does anything for SDT.


Relevant System Specs:
Windows 7
AMD FX 8350 Black Edition 4.00GHz (no overclock)
1x MSI GTX 980 Ti 6GB
12GB DDR3 RAM (sorry, I can't remember the frequencies, but they aren't that high; just old sticks I found lying around the office. I should also mention that in my tests Flash Player was allowed to use unlimited memory)
I have most of the mods from sby's loader pack 5 in my init folder, and I've added a shitton of hair and backgrounds for moreclothingmods to load on startup, so my RAM usage is probably on the high side.
 
