...Microsoft's private development tools? Microsoft uses Visual Studio and MASM like the rest of us lowlifes, mang.
what did I say?
I know what Cygwin is.
As long as you aren't using inline assembly (Microsoft's and Intel's compilers use Intel syntax, while GCC uses AT&T syntax), your code will be cross-platform. This is why standards exist.
I would also like to definitively state that it'll be easier to port a well-written standards-conscious app to the GNU toolchain after completing it with Visual Studio, since hitting Ctrl+Shift+A to add a new source file is decidedly more convenient than having to crack open your makefile in vim.
See above.
Also, GCC is far from the only compiler with configurable linking options and optimization levels, and the majority of the command-line switches are shared amongst GCC, ICC, and CL, to the extent that you can use ICC or CL as a drop-in replacement for GCC (useful if you wanted to, for instance, use distcc with a Visual Studio project).
Yeah, improve your code by getting rid of them. I don't know where you got the idea that preprocessor macros are good, because they aren't. They make your code more complex, less readable, and less machine-readable (which matters for things like code completion); they offer no invocation-time type safety (behold the BIGINTEGER(x, y) macro that lets you mash together two floats just as easily as two integers); and, as I've already said, C++ gives you better, more readable alternatives.
Also, the C# preprocessor is not like the C preprocessor. You cannot write macros with C#. It isn't even technically a preprocessor; directives are handled during tokenization.
using (a plain keyword, not actually a directive) tells the compiler what namespaces to look under
#region/#endregion are used by Visual Studio for collapsible outlining/readability.
#warning, #error - again, used by the compiler to emit a specific warning or error
#line forces the compiler to use a specified line number and code file (code generation programs like parser generators would use this)
#define, #undef, #if, #elif, #else, #endif, etc. are the closest it gets to C++, and even then #define can only create a symbol, not a macro with a body, so they're only used for things like "#if DEBUG" and "#if XBOX".
As far as I know, only GCJ can produce portable Java bytecode. Optionally. I guess there's no real reason a GCSharp compiler couldn't produce either GCC IL or CIL but that would defeat the purpose, wouldn't it?
Yeah, it's a pity you'd be stuck with Visual Studio's debugger seeing as how it only manages to be as feature-rich and capable as GDB, with support for edit and continue, and is only integrated directly into the IDE so you can visually step through a program while looking at your original code without having to use a command line tool.
You'd recommend Python for Windows game development? If you're going to recommend it for game development, at least be consistent, because Python is just as good on Linux as it is on Windows, and that portability would be its main advantage.
Welp, this programmer has never existed.
IIRC, the first documented buffer overflow exploit was on UNIX. Old versions of UNIX had a 128-character limit on filenames (not including the null terminator), while the file-manipulation functions in the old libc used a fixed 128-character buffer. Not even the people who invented the idea fit the description you gave.
Software development is a massive discipline and there is specialization. You can't be good at everything. If you want to be an OS programmer then you might be worried about interrupts (which is probably what you meant when you said "system call"), but knowing how to write bare-metal code, switch into protected mode, or write a scheduler doesn't mean you're going to be sod-all good at writing a game or a text editor.