Long story short: I'm going to college for animation and had to get a MacBook Pro (which I hate). Every time I bring up why Macs are bad, my sister chimes in with "Adobe products run better on Macs."
Is there any truth behind this? Or is it just one of those stupid misconceptions, like "Macs don't get viruses"?
There are certain things that are different and can run differently. The Mac OS is actually pretty good. From my experience, most people in the software development world prefer Unix-based systems (Linux, Mac, etc.) to Windows.
I don't imagine you'll run into too many differences. There are some programs that format things differently on Mac versus Windows, and you'll need to be in line with whatever the school has. Historically, certain products (mainly in animation, design, video editing, etc.) were targeted at the Mac OS. Nowadays, that's not really the case. One program that does come to mind as a lot better on Mac is iTunes, but that's because it's Apple's own software.
There are a lot of more technical reasons, but there really shouldn't be a problem for you.
Anecdotal, I know, but at every computer event I've been to, almost everyone has had Macs and prefers them. I know only a few developers/sysadmins who prefer Windows to Mac/Linux for general work.
I know I love mine because it actually runs the software I need without having to jump through hoops to configure it all on Windows.
Ha ha. No. Running most software on modern operating systems as large as OS X and Windows does not require you to "jump through hoops to configure it all" anymore, and it hasn't for a long time. In almost every case it's as simple as clicking Install on Windows, or dragging an app into the Applications folder on OS X. Though I suppose you do have the option to configure things if you'd like on Windows.
There's no longer a gap between OS X and Windows when it comes to "general work", and there hasn't been for many years. It used to be that OS X was better for productivity tasks, but it no longer is. Mostly everything works the same, if not better, on Windows.
Windows not being POSIX compatible is an absolute nightmare for doing half of what I do. So much software isn't even built for it, or can't be built for it.
It's crazy that I'd need to install a bunch of third-party binaries, or download something as massive as Visual Studio, to build things (which require tons of porting to even work, and often aren't tested or are buggy) just to do what I think of as basic.
Also, for what it's worth, I use Windows, OS X, and Linux on a daily basis.
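To make the complaint above concrete, here's a tiny Python sketch (mine, not from the thread) of the kind of POSIX-ism that doesn't translate: `os.fork()` exists on Unix-like systems but is simply absent from the `os` module on Windows, so code built around it has to be restructured rather than just recompiled.

```python
import os
import sys

def platform_supports_fork() -> bool:
    # os.fork is only defined on POSIX-style systems (Linux, OS X, the BSDs).
    return hasattr(os, "fork")

def describe_platform() -> str:
    if platform_supports_fork():
        return f"{sys.platform}: fork() available; POSIX-style process code works as-is"
    return f"{sys.platform}: no fork(); code must be restructured around spawn/subprocess"

print(describe_platform())
```

On Windows the portable escape hatches are `subprocess` and `multiprocessing` with the spawn start method, which is exactly the kind of rework the complaint is about.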
Never had that problem downloading Visual Studio. I've only used the 2010 and 2008 versions.
For me, most of the software I use is available on most OSes. I stay on Windows because of how easy it is to install new hardware, and because installing software is really easy. I haven't really used Mac OS, but on Linux I had to search for how to install Java and some other programs I used. On Windows you just download an installer and click Next a bunch of times.
Which OS you use should be your choice, not mine, not anyone else's (except maybe your job's). If you prefer OS X over Windows, that is your choice.
Assuming you're running a nice Linux distro *cough*Arch Linux*cough*, you can install literally anything from the AUR with a single command, without even having to grab installers!
I'm certainly not saying that everyone should use one OS or another because of what I need; everyone has different requirements for what their computer should do. That's why I still use Windows and Linux! I was just replying to another comment talking about people in the computer world preferring OS X.
Anecdotal, I know, but at every computer event I've been to, almost everyone has had Macs and prefers them. I know only a few developers/sysadmins who prefer Windows to Mac/Linux for general work.
Anecdotal indeed, and useless to know when many if not most people (myself included) will have had the opposite experience.
There's no longer a gap between OS X and Windows when it comes to "general work", and there hasn't been for many years. It used to be that OS X was better for productivity tasks, but it no longer is. Mostly everything works the same, if not better, on Windows.
Well, it really depends what you're doing. For example, audio production is still somewhat of a Windows-centric field, although that's gradually becoming less the case as more major software is ported to OS X (but good luck trying to do any serious audio production on any other OS). More multi-platform software is being made, and less Windows-only software, but there's still a disparity to be solved.
This is just my opinion, since people are always arguing over OSes. Everyone has their own opinions and personal needs. These are just my personal thoughts.
Mac OS X is actually a lot easier to use than something such as Windows 8. Linux, not so much, due to drivers and installing some things, as it can be buggy. I use Linux as my main OS though, as I believe once Vulkan releases everything should be quite a bit more efficient, and it supports all my needs. Macs are a little overpriced, but even though they don't have the best specs, they're very well built and solid laptops. Most gaming laptops have quite a few heating issues and such.
I have to say though. The 5K iMac is far from overpriced.
I'm not flaming as I stated before.
Windows is POSIX compatible; it has a complete POSIX subsystem, in fact. It isn't POSIX certified, and certain operations, such as fork(), simply have no equivalent in the Windows world, so the fork() function gets mapped to an exec() function.
The problem is less about limited POSIX compliance, and more that most software is not POSIX compatible to begin with; furthermore, not being POSIX compatible is practically a requirement for making software that isn't absolute garbage. POSIX was mostly a set of standards meant to get some consistency among the UNIX systems of the time. It assumes a UNIX system, and therefore UNIX methodologies, which do not port to other operating systems; that makes full POSIX compliance on other operating systems pretty much impossible, since the standard was created post facto by the United States government to describe a set of loosely competing UNIX systems and ease procurement requirements.
is an absolute nightmare for doing half of what I do. So much software isn't even built for it, or can't be built for it.
Well, of course. If what you do is steeped in *nixisms, such as the configure/make/install cycle, that is going to be annoying or a nightmare. It is not impossible, though: you could use tools like Cygwin, but that is merely providing you with the *nixisms you are familiar with, like bringing a familiar teddy bear to a place that is completely unfamiliar.
Any piece of software could be ported to Windows or vice versa (excepting of course those that are specific to features of each OS). The problem is more that typically Software doesn't jump the divide.
None of the software I work on works on Linux. It would be 100% possible to get it working on Linux; however, it hasn't been determined to be worth investing the time to make it compatible. It would also mean losing a lot of features, or making a lot of features Windows-centric, particularly since a lot of capabilities are built using advanced P/Invoke. As a specific example, I added the capability to have a Windows Forms ListView sorted generically. Some of it would work via Mono, which provides a WinForms-compatible interface to the desktop environment, but I also use various Windows API calls to add advanced features, and there is no Linux equivalent that could be used from Mono; such features would need conditional compilation or a runtime check for Linux. This would also require eliminating or changing features that interact with the Windows Service Manager or the Task Scheduler to instead use Unix services and crontab. Possible, but not at all trivial, and a development investment that is hard to justify given that no customers have ever expressed interest in running our software on Linux. (We barely convinced some of them to move to Windows from a VAX/VMS-derived system.)
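As a sketch of the kind of platform split described above (the task names here are illustrative assumptions, not the poster's actual code): scheduling a recurring job means talking to the Task Scheduler's `schtasks` tool on Windows, but appending a crontab line on Unix-like systems, so the code has to branch per platform.

```python
import platform

def schedule_command(task_name: str, command: str, every_minutes: int) -> list[str]:
    """Return the shell invocation that would register a recurring job
    on the current platform. (Building, not running, the command keeps
    this sketch side-effect free.)"""
    if platform.system() == "Windows":
        # schtasks ships with Windows; /sc minute /mo N = every N minutes.
        return ["schtasks", "/create", "/tn", task_name, "/tr", command,
                "/sc", "minute", "/mo", str(every_minutes)]
    # On Unix-like systems, append a line to the user's crontab instead.
    cron_line = f"*/{every_minutes} * * * * {command}"
    return ["sh", "-c",
            f'(crontab -l 2>/dev/null; echo "{cron_line}") | crontab -']

print(schedule_command("nightly-sync", "backup.sh", 30))
```

Even in this toy case the two branches take entirely different arguments and failure modes, which is why the poster calls the real port non-trivial.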
It's crazy that I'd need to install a bunch of third-party binaries, or download something as massive as Visual Studio, to build things (which require tons of porting to even work, and often aren't tested or are buggy) just to do what I think of as basic.
Given that Windows IS POSIX compliant, it seems the core of your complaint here is "it works different". Well, I mean, fair enough. That's basically what any argument I would have against development on Linux would boil down to anyway.
Take your first sentence. You are saying it is rather crazy that you need to do extra work to compile source code into binaries.
Your average user doesn't build software regardless of their system. Windows programs are distributed as binaries and installed; on *nix you might be able to find binaries, but generally the approach is that you download a source tarball and run the makefile. (And, in my experience, you then spend about 30 minutes trying to figure out dependencies, though I recently got lucky with the HP LaserJet printer support for CUPS, CUPS itself being a whole other ball of wax, and only needed to add a few.)
On Windows you can still use GCC. If you are referring to being able to build Windows-specific software on Windows systems, you could also use the .NET Framework SDK, which includes all the build tools.
One of the paradigm differences between Windows development and *nix development is that on *nix everything is CLI-based, and it is expected that your software has a makefile, which is more or less a glorified shell script (or a shell script that invokes a makefile, or something). Software often doesn't have installers; instead it is assumed you know what to do with a source tarball and what to do when errors occur. (This is arguably unfortunate for your average user who just wants to try the beta version of something.)
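The cycle being described can be sketched as data: the three commands a user is expected to run against a source tarball. (A sketch under the usual autotools assumptions; real projects vary, which is exactly where the 30 minutes of dependency-chasing comes in.)

```python
import subprocess

# The classic configure/make/install dance, as a list of steps.
BUILD_STEPS = [
    ["./configure", "--prefix=/usr/local"],  # detect deps, generate the Makefile
    ["make"],                                # compile
    ["sudo", "make", "install"],             # copy the results into the prefix
]

def build_from_source(dry_run: bool = True) -> list[str]:
    """Render (and optionally run) each build step, stopping on failure."""
    rendered = [" ".join(step) for step in BUILD_STEPS]
    if not dry_run:
        for step in BUILD_STEPS:
            subprocess.run(step, check=True)  # raises if any step fails
    return rendered

print(build_from_source())
```

The point is less the commands themselves than the assumption baked into them: that every user has a compiler toolchain installed and knows what to do when `./configure` stops halfway.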
Software requiring porting to work is hardly Windows-specific. Most Linux software needs makefile changes to support OS X, and occasionally even changes to the dependencies. That a project is not tested, verified, or designed for Windows is not a fault of Windows; it would be a fault of the project. But don't try to complain to the project, because they will typically just tell you to do it yourself and file a pull request. The developer mindset is much different between the two systems.
On Windows, developers work with source code and compile programs; users don't. They run and use software. They don't need compilers or software development kits.
On *nix systems, the entire system is more or less designed around the customization that is possible by writing C programs and writing your own programs. Most systems come with a lot of pre-installed developer tools. These tools are themselves useless to your typical user; it is the installation process, which tends to compile packages from source, that lends itself to these tools being omnipresent.
Most development of OS X software occurs in Xcode, because realistically Xcode is the only way to create native OS X applications. In fact, from what I can tell, you cannot generally run software built for Linux on OS X; if you are lucky you might be able to compile and run it, but that seems to be the exception rather than the norm. Given the shared core designs and tools, though, it is probably easier to get a product designed only for Linux running on OS X than on Windows.
On Linux, I find development of any semi-serious software product to be a nightmare. Oftentimes even using a product can be a time-wasting endeavour.
Some time ago, I upgraded a system I had from Linux Mint 10 to Linux Mint 12. On Linux Mint 10, I was using a piece of software called "Desktop Drapes" to get a wallpaper-slideshow feature. It worked... and then it didn't work on Linux Mint 12. Searching around, I found it was because the desktop environment had changed. Searching further, I found there was a plugin for Drapes that added support for GNOME 3, so I installed it. Drapes started and claimed to be running, but the desktop wallpaper never changed, and there were no logs to read. Digging deeper, I found there was a patch available that fixed a problem with the plugin. Unfortunately, after trying it, I discovered that the patch itself didn't work, and I needed another patch to apply to the first patch before I applied the patched patch to the plugin.
At that point, I decided not to use Desktop Drapes, and ended up spending a weekend constructing a Python script to do the job manually by calling the gsettings program.
Does that same script work on later versions of Linux Mint? I don't think so. I'd be surprised if either Cinnamon or Mate had gsettings and it worked similarly.
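For what it's worth, the script in question would look something like this (a reconstruction of the approach described, not the original script; the schema and key are GNOME 3's, which is exactly why it wouldn't survive a change of desktop environment):

```python
import glob
import itertools
import subprocess
import time

def wallpaper_command(image_path: str) -> list[str]:
    # GNOME 3 reads the wallpaper from this gsettings key; Cinnamon and
    # MATE use different schemas, so this is not portable across DEs.
    return ["gsettings", "set", "org.gnome.desktop.background",
            "picture-uri", f"file://{image_path}"]

def cycle_wallpapers(folder: str, delay_seconds: int = 300) -> None:
    """Rotate through every .jpg in `folder`, one every `delay_seconds`."""
    images = sorted(glob.glob(f"{folder}/*.jpg"))
    for image in itertools.cycle(images):          # loop forever
        subprocess.run(wallpaper_command(image), check=False)
        time.sleep(delay_seconds)
```

Calling `cycle_wallpapers("/home/you/Pictures")` loops indefinitely, shelling out to gsettings for each image; a hard-coded schema string like this is precisely the kind of dependency on "whatever else happens to be installed" discussed below.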
Realistically, the greatest obstacle to writing software on Linux is simply how much the other software running on a given system, providing capabilities you need, can differ. With Windows you know a lot about what is running, so you don't need to consider "well, what if they have a different registry handler?" or "what if their desktop environment doesn't support per-pixel alpha blending of device-independent bitmaps?" For a Linux application to be robust and run on more distributions, you either need to restrict what your program does (keep it in the CLI, primarily) or you need to support the different components each distribution may have for the added features you want.
Really, they just work differently. Linux is more for the type of person who wants to customize their system a lot and doesn't mind doing a lot of research, writing a lot of scripts, submitting pull requests, or running patches to try to get a feature. And I won't deny it's pretty cool to have your own specially configured system after doing so (which is why I was writing a few scripts at the time). But neither is better or worse. I prefer the Windows way because, when I'm not working, I prefer not to screw around with other people's source code unless that is part of my goal. Nor do I find that open source has any specific value to me as a user: even though I'm a programmer, I don't have the time to audit every single piece of software I run. Different strengths for both, resulting in people with different preferences.
As to the topic? I think it has a LOT more to do with the development of the software you run, and far less to do with the OS. It's sort of the same deal as porting a console game to PC and having it run terribly; nobody says "ah, that is because the PC is a terrible platform". They rightly know it is the fault of those who constructed the port. The same applies between operating systems. Trying to use *nixisms in a Windows environment during development can cause performance problems. Assuming things work a certain way because they worked that way on the OS you are familiar with can lead to hard-to-trace performance issues or bugs. If a piece of software runs worse on one operating system than another, it is generally not the fault of the operating system but of those responsible for the software. There are some exceptions, I suppose, but such disparate performance between the same software on two systems is likely the result of disparate effort put into the software on those two systems, not of one operating system being more or less 'efficient'.
There are certain things that are different and can run differently. The Mac OS is actually pretty good. From my experience, most people in the software development world prefer Unix-based systems (Linux, Mac, etc.) to Windows.
https://en.wikipedia.org/wiki/Wikipedia:Citation_needed
Anecdotal indeed, and useless to know when many if not most people (myself included) will have had the opposite experience.
I fixed my statement to better clarify that it's anecdotal.
I'll have to check the prices again, but compared to the Dell 5K monitor, the iMac was probably a better deal.
Also, Vulkan will be nice, and DirectX 12. Need my CPU to last till Skylake or Zen.
Well yeah, anything is easier than Windows 8.
Classic Shell makes Windows 8 much more tolerable.
iTunes and Safari. Those things run like a slug on my computer. It's so slow it's not even funny.