The number of idiots here saying that Vista is so much worse than 7 is really annoying me. They are essentially the same thing; 7 just has some UI tweaks and some fancy features.
I meant the hardware is shite, although I don't think of Vista as being terribly user friendly, and I find the GUI ugly. It's really just personal preference, though.
You know what's not hard to do? Download W7 for free! Who could have ever thought of something as complex as that?!
Everyone with common sense.
I use W7 and I prefer it because I've figured out how to do things on W7 that I didn't know how to do on XP (mostly because of the age difference), which makes me more comfortable with something I can properly use.
No one should be using Windows XP; it's over a decade old, and has a gaping security hole -- namely, the running of all programs with administrative privileges by default, without asking if you'd like it to.
To be fair, you can configure it using LUA (Limited User Accounts), but nobody does, and even if you do, the functionality is terrible -- you have to switch users or log out to install software or perform admin tasks, for example.
No one in their right mind should use an operating system that runs everything as root without so much as a prompt to ask if that's okay.
I like when people are clueless and say stuff like "But why should I need permission! I'm the administrator and it's my computer WAAAH". They fail on so many levels.
Win7 is, before anything else, a huge bugfix of Vista. Memory issues are fixed, and the overall system is far more stable.
What huge bugs did it fix? What memory issue? The only thing that changed drastically from Vista to 7 was your average PC's hardware, which is almost entirely responsible for 7's warmer reception, as well as for the vast majority of the "stability issues" people claim were fixed. Windows Vista introduced a number of new software technologies spanning the DDK as well as the user-facing software; it's no wonder it took a few years for software vendors to get accustomed to it. As with any version of Windows, the vast majority of problems are attributable to third-party components, preinstalled crapware, and drivers. Windows 7 was a good marketing move because it basically stapled a new face on something that everybody had already unanimously decided to hate (Vista), so people were able to reevaluate the software and find that, over the intervening years, software vendors had learned how to work with Vista's new APIs properly, and drivers now used the new driver model, or were at least Vista-aware.
The changes requiring new code to be written or rewritten for Windows 7 amounted to features like jumplists; with Vista, it was damn near everything, sort of like the original speedbump between 9x and NT with XP. Anybody who was around at XP's release knows how much hate it got -- programs for 9x didn't work, you couldn't use the same drivers, etc., so early on, XP systems weren't very stable. The difference is that eventually people used XP and found it wasn't as bad as they thought, whereas with Vista, people are using Windows 7 and finding it "improves" on the problems they had with Vista, even though those problems were attributable to the hardware they had at the time, or to drivers or other software.
I like when people are clueless and say stuff like "But why should I need permission! I'm the administrator and it's my computer WAAAH". They fail on so many levels.
Can you please elaborate? I never got why UAC is so important.
I've always wondered if it would be possible for Microsoft to make an update that would implement something equivalent to UAC, on XP.
Oh, it's probably possible, but there really isn't a good reason for them to do so. I think there are third-party components that can provide it, though I'm not sure about that.
Can you please elaborate? I never got why UAC is so important.
It's simple. The most common way for a Windows system to get infected is for the user to run that super happy screensaver.exe or whatever file.
On XP, every user is an administrator. The important thing is that every application they run has full administrative privileges. So if that super happy screensaver is a trojan, the malicious code can do whatever it wants to the system: direct disk access to write to the boot sector, changing system files. It can even install a service to run under the Local System account, giving it even freer rein over the system's internals.
Now, I don't know about anybody else, but I don't see any good reason to give the programs you run every day full administrative privileges. Pre-Vista, Windows support for LUA existed, but it was really only worth the effort on domains; it required a lot of fiddling about with knobs, dials, GPEdit, etcetera. On XP you could, for example, create non-admin accounts, but if you needed to do any admin tasks you had to log off and log back on, making it a gigantic pain.
UAC tries to address this. When UAC is enabled and you log on, your account's access token is stripped of all its administrative privileges. Any application requiring those privileges will give you a UAC prompt (if it's written properly) or fail altogether (if it isn't); in the latter case you need to "Run as Administrator", but you still get the UAC prompt. The point is to make sure that every application that runs with administrative privileges has your permission to do so.
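If it helps, here's roughly how I picture the token filtering -- a toy Python sketch, not the actual Win32 API (the names `Token`, `logon` and `elevate` are mine; the real work happens inside Windows at logon and at the consent prompt):

```python
# Toy model of UAC token filtering -- illustrative only, not the Win32 API.
ADMIN_PRIVS = {"SeDebugPrivilege", "SeTakeOwnershipPrivilege", "SeLoadDriverPrivilege"}

class Token:
    def __init__(self, privileges):
        self.privileges = set(privileges)

    def can(self, privilege):
        return privilege in self.privileges

def logon(user_privs, uac_enabled=True):
    """At logon, UAC hands your shell a *filtered* token:
    the administrative privileges are stripped out."""
    if uac_enabled:
        return Token(user_privs - ADMIN_PRIVS)
    return Token(user_privs)

def elevate(full_privs, user_consents):
    """Elevation re-creates the full token, but only if the
    user clicks through the consent prompt."""
    if user_consents:
        return Token(full_privs)
    raise PermissionError("elevation denied by user")

full = {"SeChangeNotifyPrivilege"} | ADMIN_PRIVS
shell_token = logon(full)                # what your shell and its children get
assert not shell_token.can("SeLoadDriverPrivilege")  # a trojan can't load a driver

admin_token = elevate(full, user_consents=True)      # user clicked "Yes"
assert admin_token.can("SeLoadDriverPrivilege")
```

Everything you launch normally inherits the stripped token; only things you explicitly consent to get the full one.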
This also mitigates the problem even for applications you might otherwise trust; on XP, since everything ran with admin privileges, a security problem in Firefox, Chrome, Word, Excel, or Internet Explorer could take over your machine. A simple oversight in a JavaScript parser could be all a malicious programmer needs to install a Remote Access Trojan on your system without you even knowing it. UAC prevents this because the browser isn't running with admin privileges, so the damage can only extend to what your stripped user token is capable of, which for most malicious software is pretty useless.
Vista's only problem with UAC was that people got so used to the prompt that they unconditionally said yes anyway, making the feature useless. Win7 improved it and added more options so that more of the prompts were "important" decisions, mostly for when you ran an executable that required admin permissions for whatever reason. Then you could decide whether you were willing to hand over those abilities. UAC is sort of like a gynecologist's assistant who makes sure the gynecologist (the software program) doesn't do anything inappropriate without your permission.
But 90% of virus infections are caused by user errors. Like you said, for example, running a random exe file could easily infect your OS and cause major problems. So Microsoft added a security measure called UAC that should double-check your decision to run a file. That's if I understood you correctly.

So you can still get infected if you confirm to run that file, which I think most users do (click allow as fast as they can to get rid of the notification).

You can call me stupid and ignorant, but the first thing I did when I installed Windows was turn off stuff like UAC, the firewall, Windows Defender, etc. Those things annoyed the hell out of me when I was using my friend's computer (a few years back, when I had XP). If I decide to click on an untrusted and probably infected exe or other shady stuff, then I deserve to get infected and I will gladly take responsibility for my actions.
For the other 10%, where there are security gaps in programs, I choose to rely on my AV and things like Adblock and NoScript on browsers.
Windows 7. I stuck with XP for quite a while, but I was leaving my computer with my uncle to upgrade while I was on vacation, and he decided to throw in 7 Home Premium while he was at it.
Whether or not Windows 8 fails miserably (the way it looks right now, it might), I'll stick with 7 unless it becomes necessary to upgrade to a newer version.
The new boot/Start menu looks horrible. The Windows Explorer updates are fine, but everything else is crap.
Yeah, it looks like they're trying to appeal more to mobile users with the focus on the Metro interface, and PC users who have been using their OS for decades and prefer the "desktop" style are forced to stick with the dumbed-down (from what I've heard) Classic interface. Like I said, I'm only upgrading if anything needed becomes exclusive to it, like DirectX 12 or new games or something.
But 90% of virus infections are caused by user errors.
Not user errors, per se, but rather a lack of user vigilance. But I imagine that's your point. The user will always be the weakest link in the security chain; the only thing that can be done about that is education.
UAC doesn't address the security concern directly; rather, it makes the more secure choice -- running most applications with fewer privileges -- more convenient. Anybody who has set up and tried to use XP or earlier with limited user accounts knows what a gigantic pain it is, and half the time nobody bothers. With *nix, all programs run with the privileges of the user, which are typically quite limited (no writing to /usr/bin, /etc/fstab, etc.) but good enough for most applications you would run, including X. When you need to run an administrative program, however, the task is simple: you run the program via su or sudo (depending on the distro). What this says is "I trust this program with full administrative permissions". The point is to give the user the power, rather than just giving everything free rein. With XP, for example, you either ran a program as the administrator, dealt with the hairy issues of setting up a Limited User Account, or didn't run the program at all; it was not at all easy to run a program in an unprivileged mode.
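Here's the *nix flow as a toy Python sketch (the paths and the sudoers list are made up for illustration; real su/sudo obviously aren't Python functions):

```python
# Toy model of the *nix scheme: processes carry the invoking user's
# uid, and only uid 0 (root) may touch system paths.
SYSTEM_PATHS = {"/usr/bin", "/etc/fstab"}
SUDOERS = {"alice"}  # hypothetical set of users allowed to elevate

def write(uid, path):
    """Pretend filesystem write, gated the way the kernel gates it."""
    if path in SYSTEM_PATHS and uid != 0:
        raise PermissionError(f"uid {uid} may not write {path}")
    return "ok"

def sudo(user, action, *args):
    """Run ONE action as root -- the explicit 'I trust this program
    with full administrative permissions' step."""
    if user not in SUDOERS:
        raise PermissionError(f"{user} is not in the sudoers file")
    return action(0, *args)

# Day-to-day work runs unprivileged and is unaffected...
assert write(1000, "/home/alice/notes.txt") == "ok"
# ...but the same user can't trample the system by accident:
try:
    write(1000, "/etc/fstab")
except PermissionError:
    pass
# Elevation is deliberate and per-command:
assert sudo("alice", write, "/etc/fstab") == "ok"
```

The key design point is that elevation is per-command and opt-in, instead of being the permanent state of every process, which is exactly what UAC bolts onto Windows.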
Like you said, for example, running a random exe file could easily infect your OS and cause major problems. So Microsoft added a security measure called UAC that should double-check your decision to run a file.
Again, UAC itself isn't the security measure, per se, since the same measure could be implemented in all versions of NT; the security measure is the stripping of access rights and the selective re-adding of privileges via the consent dialog. Another important point is that, unlike many other dialogs, the UAC dialog cannot be manipulated programmatically -- a program can't move the mouse and click the button for you, for example. This makes sure it was the user who made the decision, not a piece of software trying to choose for them.
So you can still get infected if you confirm to run that file, which I think most users do (click allow as fast as they can to get rid of the notification).
Yes, but that is unavoidable. Most users, when faced with any dialog, just want it to go away. UAC is more for intermediate users who understand basic risks. For example, on XP that super happy fun.exe program would run with full admin privileges the moment it was executed. Vista or later with UAC enabled will show a prompt. Now, a careful observer at this point would wonder, "why the heck does a screensaver need admin permissions?" and probably click No; the program will probably still run, but without admin privileges, which limits the ability of an embedded trojan to do anything nasty.
You can call me stupid and ignorant, but the first thing I did when I installed Windows was turn off stuff like UAC, the firewall, Windows Defender, etc. Those things annoyed the hell out of me when I was using my friend's computer (a few years back, when I had XP). If I decide to click on an untrusted and probably infected exe or other shady stuff, then I deserve to get infected and I will gladly take responsibility for my actions.
The only thing out of that set I have enabled on my system is UAC. The firewall is pointless because at the moment the system in question isn't internet-connected (and when it is, it's behind my router's firewall anyway), and by the time a threat gets to the point where a software firewall can mitigate the problem, the damage is already done; I subscribe to the idea that it's better to prevent a RAT from being installed to begin with than to silence its attempts to phone home.
For the other 10% where there are security gaps in programs, I choose to rely on my AV and things like Adblock and Noscript on browsers.
Makes sense. For the record, except for a short stint after dealing with a Virut infection on my main PC (which, it turned out, wouldn't have been stopped by any AV at the time -- wonderful), I've not used AV software on my Windows machines. While I was using XP I'd get an infection about once every 6 months or so, deal with it using the Recovery Console or whatever, and be fine. I've had things try to infect my PC since moving to Vista and 7, usually for the reason you note: allowing admin permissions for something. For me, what I find important is that when I run a foreign file, the dialog reminds me, "OK, better watch for any suspicious activity" -- assuming I decide to run it at all, of course. Also, a lot of malware isn't written to request admin permissions, so it doesn't get them at all. I've had "infections" of my standard user account entirely nullified simply because the malware thought it had full control but actually got permission-denied errors when it tried to "infect" the system, download files from a remote source, etc.

Dealing with UAC on the software side requires more attention; if a program doesn't request admin permissions, consent.exe never shows the UAC dialog and the program soldiers on under the stripped user token. This is particularly important since the vast majority of malware doesn't try to elevate itself; it basically sits there assuming it has full control, stumbling over its own feet. The downside is that sometimes, if it's a trojan, the program the trojan is in doesn't work either, so the user often ends up running the program as administrator to "fix" it (because they were told to try that on a forum, or something), so it's arguable how much benefit it truly has. For me, though, I like it because it changes the infection vector from "you're ****ed and can't do anything about it" to something within my power to corral.
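That "stumbling over its own feet" behaviour is easy to sketch -- toy Python again, with a made-up `write_file` standing in for the real ACL checks, and a pretend trojan that never requests elevation:

```python
# Toy model: malware running under the stripped token hits
# permission-denied on everything system-level.
USER_WRITABLE = ("C:/Users/me/AppData",)  # made-up user-writable prefix

def write_file(path, elevated=False):
    """Stand-in for a filesystem write guarded by ACLs."""
    if elevated or any(path.startswith(p) for p in USER_WRITABLE):
        return "written"
    raise PermissionError(path)

def dumb_trojan(elevated=False):
    """Assumes it owns the machine; never asks for elevation."""
    damage = []
    for target in ["C:/Windows/System32/drivers/evil.sys",
                   "C:/Windows/System32/config/SAM",
                   "C:/Users/me/AppData/evil.dll"]:
        try:
            write_file(target, elevated)
            damage.append(target)
        except PermissionError:
            pass  # the "infection" silently fails here
    return damage

# Under the stripped token, the payload only lands in user-writable space:
assert dumb_trojan() == ["C:/Users/me/AppData/evil.dll"]
# Run it "as administrator" to "fix" the broken program and it gets everything:
assert len(dumb_trojan(elevated=True)) == 3
```

Which is exactly the double-edged part: the denied writes neuter the payload, right up until someone elevates the whole thing to make the host program work.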
Another point is that not all executables are invoked directly by the user, particularly exploit code that downloads a file and runs it using the system shell functions. On XP, or on a non-UAC-enabled system, the program would have full admin rights and would trample all over the place; with UAC, though, you'd get a prompt, out of nowhere, from what would probably be a randomly named executable. Obviously some percentage of users will blithely allow it, but for me the important thing is that there is a containment plan. AV software, in my experience, doesn't protect you very well (it does something, of course, but it cannot protect you from new threats, and the new threats are always the most dangerous). Also, programs can place registry entries, using only the standard user account, that attempt to execute programs with higher privileges, so at startup you'd get a UAC prompt. Same with the various browser add-ons that could be called infectious (mostly for IE, really).
For AV software, my main gripe is when the software declares your system "clean" or "free of viruses" (I don't know if they still do that). Why? Well, all the AV software can do is determine that your system, as far as it can tell, doesn't have any known infections. But completely new malware that isn't heuristically similar to existing malware could easily have its greasy paws in your system, and the AV still declares "0 VIRUSES" or "you are clean". Arguably it's a pedantic take on the matter, but I've always felt it gives people a false sense of security; just because your AV says your system is free of infection doesn't mean it truly is. And then you have the various false positives, and the instructions I sometimes see -- "you should run the program with your AV disabled" -- which sort of defeats the entire purpose of the AV, I would think.
Well, they did make it a bit faster, I guess, but on most modern computers you'll see no difference in performance. I really liked Vista, but now I'm using 7. I would dual-boot, but I'm way too lazy to partition my HDD.
The point was more that I ALWAYS read comments from people rambling nonsensically about how much of a "RAM hog" Vista was. The only reason it was a "RAM hog" is that everyone was using garbage XP-era hardware when it came out, trying to run it off 1 GB or so. Now they're using 4, 8, 16 GB with Win7 and suddenly think the world got flipped upside down.
Even the numbers I pointed out are minuscule; on a system with 4 GB of RAM, that's like a 3% drop in RAM usage, and you'd have to be a machine to notice the difference.
Somalia. Because it puts the ARRRR in anarchy! DOHOHOHOHOHOHOHOOH.
2 or 3 months. A long-ass time. Oh boy, Visual Basic. I can barely contain my excitement. Not.
And Aero Snap, I love that thing.
You're dumb, get over it. Here, I'll even zap your little brains with comprehension.
XP <<< Vista < Win7
That's basically feature-wise too; nothing about Win7 is inherently better than Vista to justify any sort of reason to upgrade.
Ah, I see.
Ya, having a larger Start menu where I have room for all my shortcuts and can get to them quicker sure does suck.
Here's why I think you're dumb.
http://www.sysprobs....s-7-memory-test
You're also being trendy, just trying to sound like Vista is worse because everyone else told you it is.