zongor [comrade/them, he/him]

  • 2 Posts
  • 53 Comments
Joined 4 years ago
Cake day: September 2nd, 2020

  • The header file was not originally made for the purpose it is used for today. Earlier languages (like Fortran or COBOL) had a preprocessor that was used for defining constants, macros, and the like. The preprocessor is like a glorified cut-and-paste machine; it can’t do any complex processing by itself. (In fact, the C preprocessor is not even Turing complete, although it is close.)
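    The cut-and-paste nature is easy to see in a small sketch (the names here are made up for illustration):

    ```c
    #include <stdio.h>

    /* Constants and macros are substituted textually before the
     * compiler proper ever sees the code. */
    #define BUFFER_SIZE 128
    #define SQUARE(x) ((x) * (x))

    int main(void) {
        char buffer[BUFFER_SIZE];      /* becomes: char buffer[128]; */
        printf("%d\n", SQUARE(3 + 1)); /* becomes ((3 + 1) * (3 + 1)), i.e. 16 */
        (void)buffer;                  /* silence the unused-variable warning */
        return 0;
    }
    ```

    The extra parentheses in SQUARE are needed precisely because the expansion is textual: without them, SQUARE(3 + 1) would expand to 3 + 1 * 3 + 1, which is 7.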

    The reason headers are included at the top is also historical. A single-pass compiler reads a file line by line and parses it into an Abstract Syntax Tree; a function has to be declared before it can be used, but it may be defined in a different file or later in the same file. So it’s convenient to put the declarations in a header.
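    For example (file names invented for illustration), the header carries just the prototype, so any file that includes it can call the function before the compiler has seen the definition:

    ```c
    /* mathutil.h -- the declaration other files include */
    #ifndef MATHUTIL_H
    #define MATHUTIL_H
    int add(int a, int b);
    #endif

    /* main.c -- can call add() knowing only its prototype */
    #include <stdio.h>
    #include "mathutil.h"

    int main(void) {
        printf("%d\n", add(2, 3)); /* prints 5 */
        return 0;
    }

    /* mathutil.c -- the definition, compiled separately and linked in */
    #include "mathutil.h"
    int add(int a, int b) { return a + b; }
    ```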

    Many modern languages use compilers that take multiple passes to generate the code. They also keep internal databases of the objects and their prototypes, and generate structures like v-tables, to store data about the program for optimizations and the like.
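    A v-table is essentially just a table of function pointers; here is a rough hand-rolled C sketch of the idea (all names invented for illustration):

    ```c
    #include <stdio.h>

    /* A hand-rolled "v-table": a struct of function pointers mapping
     * a method slot to an implementation, roughly what a compiler
     * generates behind the scenes for dynamic dispatch. */
    typedef struct {
        void (*speak)(void);
    } AnimalVTable;

    typedef struct {
        const AnimalVTable *vtable; /* each object points at its type's table */
    } Animal;

    static void dog_speak(void) { printf("woof\n"); }
    static void cat_speak(void) { printf("meow\n"); }

    static const AnimalVTable dog_vtable = { dog_speak };
    static const AnimalVTable cat_vtable = { cat_speak };

    int main(void) {
        Animal pets[] = { { &dog_vtable }, { &cat_vtable } };
        for (int i = 0; i < 2; i++)
            pets[i].vtable->speak(); /* dispatched through the table */
        return 0;
    }
    ```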

    Languages like Rust, Zig, and Go use modules: namespaces in which definitions are declared once and can be used elsewhere. They also ship with built-in tooling for managing dependencies, linking, etc.

    Most languages also have a Foreign Function Interface (FFI), which allows them to call functions written in a different language (like C shared libraries). All of the languages you mentioned have great FFI functionality and work well with C shared libraries. You can often reuse C header files here, since they give the function prototypes without your needing to read the whole source code and find all the definitions (and if the library is proprietary, you will often only have access to the shared library and the header files anyway).
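    As a sketch of what such a vendor header might look like (libfoo.h and the function names are hypothetical), the prototypes are all an FFI needs; no source code is required:

    ```c
    /* libfoo.h -- hypothetical header shipped alongside a proprietary libfoo.so */
    #ifndef LIBFOO_H
    #define LIBFOO_H

    #ifdef __cplusplus
    extern "C" { /* keep C linkage when included from C++ */
    #endif

    int  foo_init(void);
    long foo_compute(const char *input);

    #ifdef __cplusplus
    }
    #endif

    #endif /* LIBFOO_H */
    ```

    Binding generators like Rust’s bindgen or Zig’s translate-c read exactly this kind of header to produce the foreign declarations automatically.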

  • Always do the disk management from the Windows/Mac side first. Windows does some weird stuff with its disk layout (like unmovable files), so it’s easy to accidentally break things if you partition from Linux. I think Mac is slightly safer, but it has its own quirks, so it’s a lot easier to use its built-in partition tools to shrink and repartition. Then during the install you can reformat the freed partition with a more Linux-friendly filesystem (e.g. ext4, btrfs, etc.).

    As far as file sharing goes, I have a separate disk that is FAT32-formatted, but that’s just because I was hyper-paranoid; NTFS works fine and has been in the Linux kernel for some time now. Mostly I did this so I could share the same Steam library between my Linux and Windows partitions and not have to install everything twice. External drives also work very well, but I think they’re better for backups than for day-to-day use. You can also mount your Windows partition directly in Linux, so there’s really no need for a third partition just for sharing.
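    Mounting it directly can be as simple as this (a sketch; /dev/nvme0n1p3 is an assumed device name, so check yours with lsblk first):

    ```sh
    lsblk                       # find the Windows NTFS partition
    sudo mkdir -p /mnt/windows
    sudo mount -t ntfs3 /dev/nvme0n1p3 /mnt/windows  # the ntfs3 driver has been in-kernel since 5.15
    ```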


  • It kinda depends on what games you are playing.

    If they are online-only with anti-cheat, dual booting is the only viable solution, because most anti-cheats that don’t work with Linux/Proton will flag you as cheating if you try to use a VM.

    If it’s some older game, it’s prolly better to use a VM for that OS; a lot of old Windows XP or Windows 95 games are like that. For really old ones you can just use DOSBox, which is very tried and true.

    If it’s just some random game that doesn’t work, I either (a) figure it will get working somehow eventually, or (b) give up on ever playing it again.

    I think I’m at the point where, if a new game came out and didn’t work on Linux, I just wouldn’t buy it. But I might be an outlier, since most of the games I like usually get a Linux port or work with Proton anyway.