
Well ArcaOS 5.0 is probably the last version for me.


Neil Waldhauer:
I think Darius asked the right question. Why do we need 64-bit?

The 2TB barrier for DASD can be overcome without it.
The 4 GB barrier for memory use has already been somewhat lifted with the new RAM drive.

But do not underestimate code bloat. At some point in the not-too-distant future, a simple web browser will not fit into the largest available OS/2 addressable space.

A good 64-bit API added to OS/2 could possibly make it the only 64-bit operating system that can execute compiled programs from the 1980s.

Martin Iturbide:
Hi

I like the theories about an OS/2 64-bit kernel. Here is the forum thread, which everybody is free to continue.

OS/2 does not need a 64-bit kernel today; a 64-bit kernel is needed if there is to be a long-term strategy for the platform, where we want to use all the resources of new hardware. Maybe a dumb comparison is buying a new car and never turning on the radio, wipers or lights. The basic use of the car is covered (moving people from one place to another), but you are not getting the complete experience.

I prefer the idea of grabbing a new 64-bit microkernel with market potential (Zircon?) and replacing the OS/2 kernel with it. You take the open source kernel, create the OS/2 binary interpreter and a clone of the CPI API, and try to run OS/2 on top of it (PM, SOM, WPS, apps). Easy to say, but it is a lot of work and a lot of little details would need to be worked out.
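To make that a bit more concrete, here is a toy sketch (purely illustrative, not ArcaOS or OS/4 code) of what one tiny corner of such a CPI compatibility layer could look like on a POSIX-style host: a DosOpen-like call translated to the host's open(). The simplified prototype, the flag bit and the MyDosOpen name are my own assumptions; the real DosOpen has many more parameters and semantics (extended attributes, sharing modes, open actions) that a serious effort would have to honor.

--- Code: ---
/* Toy sketch of one CPI entry point emulated on a POSIX host kernel.
   Types, error codes and flag handling are heavily simplified. */
#include <fcntl.h>
#include <unistd.h>

typedef unsigned long APIRET;
typedef unsigned long ULONG;
typedef int           HFILE;            /* the real handle type is opaque */

#define NO_ERROR            0UL
#define ERROR_OPEN_FAILED 110UL
#define MY_OPEN_CREATE   0x10UL         /* illustrative flag, not the real CPI bit */

/* Simplified DosOpen work-alike: map the request onto the host kernel. */
APIRET MyDosOpen(const char *pszFileName, HFILE *phf,
                 ULONG *pulAction, ULONG fsOpenMode)
{
    int flags = O_RDWR;
    if (fsOpenMode & MY_OPEN_CREATE)
        flags |= O_CREAT;

    int fd = open(pszFileName, flags, 0644);   /* host system call */
    if (fd < 0)
        return ERROR_OPEN_FAILED;

    *phf = (HFILE)fd;        /* hand the host descriptor back as an HFILE */
    *pulAction = 1;          /* "file existed and was opened" */
    return NO_ERROR;
}
--- End code ---

Multiply that by every Dos*, Win*, Gpi* and SOM entry point, plus the LX loader and 16-bit thunking, and you get an idea of the size of the job.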

Regards

Dave Yeo:

--- Quote from: Neil Waldhauer on May 26, 2021, 03:56:58 pm ---I think Darius asked the right question. Why do we need 64-bit?

The 2TB barrier for DASD can be overcome without it.
The 4 GB barrier for memory use has already been somewhat lifted with the new RAM drive.

But do not underestimate code bloat. At some point in the not-too-distant future, a simple web browser will not fit into the largest available OS/2 addressable space.

--- End quote ---

Code bloat is one of the big things. Developers generally have powerful machines, and that is what they target. For commercial software, the managers want fast development, not slow, optimized development. And for open source, once again the developers likely have powerful machines with lots of RAM and, in general, little motivation to optimize.
Otherwise, browsers are going to use more memory: large canvases, JavaScript engines doing more and more, and the big one, sandboxing. One weak point in all operating systems is browser-based malware; as we do more and more in the browser, it becomes more important to stop one tab from spying on another or otherwise affecting it.
Then there's also building this stuff. For quite a while, building Mozilla was close to the limit: I'd see compiling one file take up over a GB of memory and linking use the whole address space. This has got worse with the Qt web stuff. While it is still buildable, care has to be taken. Instead of taking advantage of all cores by running multiple compile jobs, only one runs, which slows things down to the point where a recompile can take most of a day.
Manipulating images, and especially videos, is another area. Cameras get more pixels, creating bigger images, and videos get bigger and use more memory-intensive compression.
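Just to put rough numbers on it (my own back-of-the-envelope figures): a single uncompressed frame from a modern sensor already eats a noticeable slice of what a 32-bit process can address, and editors keep several frames or undo copies around at once.

--- Code: ---
/* Back-of-the-envelope frame sizes vs. a 32-bit address space. */
#include <stdio.h>

int main(void)
{
    unsigned long long w = 8000, h = 6000;   /* ~48 megapixel still image */
    unsigned long long bytes = w * h * 4;    /* 32-bit RGBA, uncompressed */
    double mib = bytes / (1024.0 * 1024.0);

    printf("One frame: %.0f MiB\n", mib);    /* ~183 MiB */
    printf("16 frames or undo copies: %.1f GiB\n",
           16.0 * mib / 1024.0);             /* ~2.9 GiB, close to the usable per-process space */
    return 0;
}
--- End code ---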
Sure, there are other things as well.
While software could be written to take advantage of the RAM above 4 GB, it is non-trivial and unlikely to happen in a big way.
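About the only practical way an existing 32-bit program can benefit from the memory above 4 GB today is indirectly, by spilling scratch data onto the PAE-backed RAM disk instead of keeping it in its own address space. A minimal sketch, assuming the RAM disk is mounted as Z: (the drive letter and file name are assumptions):

--- Code: ---
/* Minimal sketch: spill a large working set to a RAM-disk file instead of
   holding it below the ~3 GB per-process ceiling.
   Assumes the PAE RAM disk is mounted as drive Z:. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    const char *scratch = "Z:\\scratch.bin";  /* hypothetical path on the RAM drive */
    size_t chunk = 64u * 1024u * 1024u;       /* work in 64 MiB pieces */
    char *buf = calloc(1, chunk);
    if (!buf) return 1;

    FILE *f = fopen(scratch, "wb");
    if (!f) { free(buf); return 1; }

    /* Write out data that would otherwise have to stay resident in the
       32-bit address space; re-read pieces later as needed. */
    for (int i = 0; i < 8; i++)               /* 512 MiB of scratch data */
        fwrite(buf, 1, chunk, f);

    fclose(f);
    free(buf);
    remove(scratch);
    return 0;
}
--- End code ---

It works, but as said above it is non-trivial to retrofit into existing code, which is why it has not happened in a big way.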


--- Quote ---A good 64-bit API added to OS/2 could possibly make it the only 64-bit operating system that can execute compiled programs from the 1980s.

--- End quote ---

The way 64-bit mode works is incompatible with 16-bit software, and OS/2 has too much 16-bit code internally. Programs from the '80s are usually 16-bit. There's a reason that even 64-bit Windows can't run that old code natively, and that's the design.

Roderick Klein:

--- Quote from: Neil Waldhauer on May 26, 2021, 03:56:58 pm ---I think Darius asked the right question. Why do we need 64-bit?

The 2TB barrier for DASD can be overcome without it.
The 4 GB barrier for memory use has already been somewhat lifted with the new RAM drive.

But do not underestimate code bloat. At some point in the not-too-distant future, a simple web browser will not fit into the largest available OS/2 addressable space.

A good 64-bit API added to OS/2 could possibly make it the only 64-bit operating system that can execute compiled programs from the 1980s.

--- End quote ---

On large discs bigger than 2 TB, GPT is supported, so it seems that issue is off the table.

Roderick

Martin Iturbide:
Hi

I remember that in the early 2000s I was expecting to see a change in software similar to the move from 16-bit to 32-bit when Intel announced the 64-bit Itanium. In the end Itanium was incompatible with all the 32-bit software (it was too aggressive a change of instruction set), the processor flopped, and 64-bit adoption on PCs took a slower path. AMD created the x86-64 instruction set (2003), which was backward-compatible with 16- and 32-bit code, and adoption took that path. I think that by 2010, 64-bit adoption on PCs was around 50% (with Windows 7).
The only improvement I saw in going from 32-bit to 64-bit was better video editing, playback and compression, maybe because of having access to more memory.

The "2TB barrier DASD"  and "4 GB memory barrier" are issues of the present. For a 64bits kernel you need to think on the future, how a 64bits kernel can improve the OS experience? what extra things can be added to ArcaOS with 64bits support?
If the goal is to have the same old comfortable thing on new hardware, without any expectation of future improvement for the platform, there is no case for 64bits.

If someone wants a 64-bit kernel for OS/2, it is because they want the OS to evolve, gain relevance, and be able to use the full resources that the hardware provides.

Regards
