> One thread to handle audio. One thread to handle networking. One thread to handle the loading of dynamic objects. One thread for window management. How special would that be?

That's not how it works.
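For what it's worth, the thread-per-subsystem idea above can be sketched with ordinary worker threads fed by message queues. A minimal illustration with a single "audio" worker; all the names here are invented for the example, not taken from any actual client:

```python
import queue
import threading

def audio_worker(inbox: queue.Queue, log: list) -> None:
    # Each subsystem runs its own loop, pulling work from a queue,
    # so the main thread never blocks on it.
    while True:
        msg = inbox.get()
        if msg is None:  # sentinel: shut the worker down
            break
        log.append(f"audio handled {msg}")

audio_inbox: queue.Queue = queue.Queue()
audio_log: list = []
t = threading.Thread(target=audio_worker, args=(audio_inbox, audio_log))
t.start()

# The main thread just posts messages and keeps going.
audio_inbox.put("footstep.wav")
audio_inbox.put(None)  # tell the worker to stop
t.join()
print(audio_log)  # ['audio handled footstep.wav']
```

In a real client there would be one such worker (or pool) per subsystem; the point of the sketch is only that the main loop hands work off instead of doing it inline.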
> That's not what Pinco says, and he's a software engineer.

But is he talking about the client supporting software multi-threading or allowing hardware multi-threading? They are two different things.
> But is he talking about the client supporting software multi-threading or allowing hardware multi-threading? They are two different things.

Beats me. All I know is the enhanced client makes extremely poor use of multiple cores, and the central processing, not graphics processing, is the bottleneck with it.
> Beats me. All I know is the enhanced client makes extremely poor use of multiple cores and the central processing, not graphics processing, is the bottleneck with it.

That sounds like you mean it doesn't support multiple cores, which isn't exactly multi-threading. Hardware multi-threading deals more with how a core or cores in a CPU handle data. CPUs have up and down cycles; hardware multi-threading means the software instructs the CPU to process another stream of data on the down cycle. Now, if the client is only making use of a single core, that might be more of a design choice where they were trying to keep system requirements as minimal as possible. While I know almost everyone is most likely on a multi-core CPU at this point, I'm wondering if they were building the EC with people on Windows ME in mind or not.
> There's a difference between a coder, even a very experienced and talented coder, and a software engineer. A software engineer is a coder, but the inverse is not necessarily true.

I work full time as an SE. There is no one thing that turns a "coder" into a "software engineer". Job titles are mostly meaningless in this field. Someone with the title of developer will likely be doing the same sort of work that another person labeled a software engineer is doing for another organization. They have software engineers regardless of their exact job titles. This super-talented computer genius that you want to come in and fix everything is a myth.
> But I did think that the requirements to get a degree in "software engineering" were more rigorous than say getting certified in C++ or in a community college as a software developer. It's one thing to code mobile apps, it's quite another to work in machine language coding the software for the robotics in a factory, like someone I once knew did. They are both coders, but the latter is more than a coder, he's a software engineer.

Well... yeah. Formal education doesn't paint the entire picture, though. Someone can still be a software engineer without a degree. I have a degree from a four-year university in CS with a specialty in SE and still say this. I'd imagine most of the devs have a CS or SE degree. It's not like the team needs some dude with a software engineering degree to come in and, because he has this degree, magically fix everything. What you're basically saying is "some people know more than others". Title still has nothing to do with this. Machine language/assembly and robotics is a specialty within a field; they will not be good at game dev just because they do robotics, and I'd bet anything that the current dev team is far more qualified to work on the game than someone who specializes in machine language/robotics. There isn't exactly much overlap there.
> But I did think that the requirements to get a degree in "software engineering" were more rigorous than say getting certified in C++ or in a community college as a software developer. It's one thing to code mobile apps, it's quite another to work in machine language coding the software for the robotics in a factory, like someone I once knew did. They are both coders, but the latter is more than a coder, he's a software engineer.

Coder, software engineer, programmer: just different terms for the same thing.
> Multicore support would make the EC, with or without PInco's, run much more smoothly.

My game runs pretty well, honestly. Now, I don't do many events or PvP with entire shard populations (24 players) on my screen, so maybe that is where you are running into issues? But for day-to-day stuff I really don't have any issues.
Multicore support would make the EC, with or without PInco's, run much more smoothly.
> Well, without Pinco's, I get around 30 - 80 fps, depending on the location, with Pinco's, I get 10 - 30 fps. ... But having said that, multithreading would make a huge difference.

Umm, who told you that? It wouldn't make any difference...
> Also, I am pretty sure Gamebryo is a single thread engine.

Spot on! The EC is a patchwork consisting of half a dozen code bases or even more, few of them thread-safe, and a couple of them not even maintained anymore (*wink* LuaPlus *wink*). It's not like there is only one codebase maintained by Broadsword that they can change at will.
It could be multithreaded if heavily modified, but it seems that they put most, if not all, of the game logic in Lua scripts... which are also not multithreaded.
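Since a Lua state isn't safe to touch from multiple threads, the usual workaround is to confine the interpreter to one dedicated thread and have everything else post messages to it. A rough sketch of that pattern, with a plain Python class standing in for the non-thread-safe script state (purely illustrative, not the EC's actual code):

```python
import queue
import threading

class ScriptState:
    """Stand-in for a non-thread-safe interpreter state (like a Lua
    state): all calls into it must come from the thread that owns it."""
    def __init__(self):
        self.counter = 0

    def run(self, event: str) -> str:
        self.counter += 1
        return f"{event} -> handled #{self.counter}"

def script_thread(inbox: queue.Queue, results: list) -> None:
    state = ScriptState()  # created and used only on this thread
    while True:
        event = inbox.get()
        if event is None:  # sentinel: shut down
            break
        results.append(state.run(event))

inbox: queue.Queue = queue.Queue()
results: list = []
t = threading.Thread(target=script_thread, args=(inbox, results))
t.start()

# Other threads never touch the state directly; they only post events.
for e in ("item_update", "item_update"):
    inbox.put(e)
inbox.put(None)
t.join()
```

The design choice here is confinement rather than locking: the interpreter never needs a mutex because exactly one thread ever calls into it.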
> *lets that synch in*

You are right in a general sense, but the discussion is pointless at that broad a scope. Please give us one example of what is too slow in your opinion. Then we can reason about whether that's because of threading or because of other bad design choices.
I realize I'm talking about stuff I don't know much about, but it does seem to me you could still take load off the main thread by putting things like HID input, dynamic object loading, and networking functions on their own threads. What you seem to be saying is that multithreading doesn't work at all, so why bother with it? That is clearly not the case. The vast majority of games coded in the past decade are multithreaded, and many of them include Lua (and other language) scripting functionality that runs on the main thread. And one would think that all those games derive some benefit from being multithreaded, or no one would bother going to the trouble.
So you can beat me over the head with your apparent expertise, but you clearly aren't presenting the entire picture.
> Spot on! The EC is patchwork consisting of half a dozen code bases or even more. Least of them being thread-safe, couple of them not even maintained anymore (*wink* LuaPlus *wink*). It's not like there is only one codebase maintained by Broadsword that they can change at will.

Well, first off, "game logic": the clients are, for the most part, just UIs. Most of the actual game logic is server side, and what isn't, should be. The clients, both of them, have to facilitate and process input, interpret output from the server, and in the process render the game world and the UI. It is true that the Classic Client does all that without being designed with multiple processors in mind, but then, for the most part, the Enhanced Client does actually run better.
Also, as noted before, the EC is threaded. But then again, Lua can only live in one thread. And when you overload that thread like there is no tomorrow, you get what you deserve.
Let's take the case of the UI freezing when opening public EM corpses. The core EC sends about 200 item updates to the Lua UI code in a split second. Each update invokes some Lua code, which at that sheer rate of updates per second totally overloads the thread. And because Lua has to run in the UI thread, it also overloads the screen updating, only getting a screen update out every couple of seconds (freeze!). That's bad design, where multi-threading can only do so much. The better way would just be to buffer those 200 events down to 1 every 100 ms or the like. But then, there are ten more issues down the road we all didn't think of. So I refrain from calling it easy.
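The buffering idea described above, collapsing a burst of updates into one UI refresh per interval, can be sketched as a simple coalescer. The class and numbers here are illustrative, not the EC's actual code; the counts also assume the simulated burst completes in well under 100 ms, which any modern machine manages easily:

```python
import time

class Coalescer:
    """Collects rapid-fire updates and flushes them as one batch at
    most once per interval (e.g. every 100 ms), instead of doing one
    expensive UI refresh per update."""
    def __init__(self, interval: float):
        self.interval = interval
        self.pending: list = []
        self.last_flush = float("-inf")  # so the very first update flushes
        self.flushes = 0

    def on_update(self, item) -> None:
        self.pending.append(item)
        now = time.monotonic()
        if now - self.last_flush >= self.interval:
            self.flush(now)

    def flush(self, now: float) -> None:
        # One UI refresh for the whole batch instead of one per item.
        self.pending.clear()
        self.last_flush = now
        self.flushes += 1

c = Coalescer(interval=0.1)
for i in range(200):   # simulate 200 item updates arriving in a burst
    c.on_update(i)
```

With the burst arriving within one interval, only the first update triggers a refresh and the other 199 sit batched in `pending`; a real implementation would also run a timer to flush that remainder once the interval elapses.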
Anyway, I'm pretty sure most of the annoyances we see don't stem from a lack of multi-threading, but from some bad design choices made when engineering the EC. Let alone Pinco's.
> What you seem to be saying is that multithreading doesn't work at all, so why bother with it?

That's not what I'm saying, that's just how you interpret it.
Actually I did, but it didn't really seem to leave an impression:

> But, I'm pretty certain that no one has mentioned the fact that you have to 100% recode the entire client to get multi threading from a single thread coded game.
Depending on the architecture of the client, this could very well mean completely refactoring the whole client. Basically starting from scratch, in the worst case.
Multithreading isn't the ultimate solution for everything. There's always a trade-off. MT might solve a lot of issues, but it also confronts you with new ones. Therefore some applications are better suited to multithreading than others, and some less so. Games usually fall into the latter category. But that doesn't mean that games can just stay single-threaded.
> Sorry, I didn't read through everything completely like I said. My general assumption for the population of the internet, is that they don't understand how coding & software engineering work. Unless it's Bealank, because I am pretty certain she's a Software Dev of some variety from conversation in gen chat on Legends. Always happy to see that people actually know how it works, but if you know how it works, you probably also understand my assumption that the general population just doesn't get it.

Nevermind...
> @Llewen - I also record and sometimes stream my gameplay with my multi-processor Alienware box using XSplit to HitBox.

Get a bigger case with more fans! I have 18 case fans, have never had issues with heat or a screaming PC, and I regularly sit down for 6-8 hours on a Sunday to play some UO in the EC. I run 2x GTX 1080s and an i7-5960X with zero liquid cooling; the system never goes above 60 C, even in the 120 F summers in Texas, where I have to keep my AC at 90 F or the bill is ridiculous.
I have basically "told" my vid/streaming software to use one processor, while UO/EC uses the primary. I forgot how I did it and because of that I have to be careful with Windows updates. It is also because of this that I do not like the EC too much, and play it minimally. Last time I seriously played the EC was.... uh.... dang, when I was still volunteering here at Stratics (couple years now I think).
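"Telling" a process to use one processor like that is CPU affinity. A sketch of how it can be done programmatically; on Linux Python's stdlib exposes it directly, while on Windows the equivalent would be `start /affinity` or Task Manager (I'm assuming the poster used one of those mechanisms, since they don't say):

```python
import os

# CPU affinity: confine a process to a chosen set of cores, so a
# background app (e.g. streaming software) stops competing with the
# game client for the "primary" core.
if hasattr(os, "sched_setaffinity"):   # available on Linux
    os.sched_setaffinity(0, {0})       # pid 0 = this process; core 0 only
    pinned = os.sched_getaffinity(0)
else:
    # Platforms without the call (e.g. Windows, macOS) need an external
    # tool instead; this placeholder just mirrors the intent.
    pinned = {0}

print(f"process restricted to cores: {sorted(pinned)}")
```

Restricting the affinity of the background process (rather than the game) is usually the safer direction, since the game keeps access to every remaining core.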
To note, when I am live-streaming Diablo the system purrs like a loud kitten, but it does not overheat or have any game/system degradation. When I stream UO/CC, I rarely hear the fan. When I stream UO/EC, my system wants to scream so it is a very short lived session.
Am "poor student" living on part time wages while I chase after the MBA full time to get a "real job". heh... Otherwise I would have something quite similar to you, along with my dream Linux development box and a few servers for private hobbies (i love to write code & tinker in java).Get a bigger case with more fans! I have 18 case fans, never had issues with heat or screaming PC and I regularly sit down for like 6-8 hours on a Sunday to play some UO in EC.. I run 2x GTX1080, an i7 5960x, and have 0 liquid cooling, system never goes above 60 C, even in the 120 F summers in Texas where I have to keep my AC at 90 F or the bill is ridiculous.
I also wouldn't use Diablo as a gauge of your system's strength/cooling power... it uses an extremely low CPU and GPU percentage to run at max settings in 4K spread across four screens. Go stream any Paradox or Firaxis game (horribly unoptimized games that cap out core 0) and see if it behaves like it does with the EC; I can almost guarantee it will be far worse, because I've done it in 1080p with 1 monitor, 1080p with 4 monitors, and 4K with 1 and 4 monitors. It's simply not cost-effective to write a game in a multithreaded format right now.
Also, just a side note from a person who is friends with a few big YouTubers/Twitch streamers: OBS is free and is far superior in its optimization to XSplit, which costs money and uses excessive amounts of CPU and GPU/capture card power.
Edit because I forgot:
OBS does have a higher learning curve than XSplit to use it properly.
> @hungry4knowhow - I've seen some slick customs that are comparable to Herps... and then some. Machines that do a lot of rendering and compiling need to be like that. When Square was in Honolulu, I had the privilege of touring their server-room and production hall while they were coding for Final Fantasy (the movie). I drooled so much, they handed me a roll of paper towels!

But we aren't talking about server rooms/racks, nor processing machines... we are talking about personal computers, lol. Which I am fully willing to admit mine are WAY overboard for, which is why I like looking at others'. I have over $9,000 in my home computer, and somewhere between $7,000 and $8,000 in my office computer. I spent $1,500 on the 6950X processors just because I thought they were so cool. 10 cores! 20 threads! I had to have it. I just like looking at that stuff. If he comes back in and says he has these 18 fans in a server rack setup, I will be much less in awe... because, well, that's the correct application. I like looking at, "YOU DID WHAT?! WHY IN GOD'S NAME?! Oh, well, yes, that is pretty cool."
> But we aren't talking about server-rooms/racks, nor processing machines....we are talking about personal computers lol. Which I am fully willing to admit mine are WAYY overboard for, which is why I like looking at others. I have over $9000 in my home computer, and somewhere between $7 and $8000 in my office computer. I spent $1500 on the 6950x processors just because I thought they were sooo cool. 10 Cores! 20 Processors! I had to have it. I just like looking at that stuff. If he comes back in and says he has these 18 fans in a server rack setup I will be much less in awe...because well....its the correct application. I like looking at, "YOU DID WHAT?! WHY IN GODS NAME?! Oh well yes that is pretty cool"

Oh... not just servers...