Hacker News | mback00's comments

Poor and stupid - Simple you are. In a civil society those that provide a utility (or "fun") to others are rewarded and incentivized.


I have nothing against people being rewarded for doing stuff others enjoy. What I have a problem with is the freemium model that seems to have taken over. What some developers are doing is nothing but greed. I had quite a few games that I paid for. Then, one day, the app claims that "due to compatibility issues, I need to download The New $game to continue receiving updates". Then you go and look at the "new" version and find it's free and full of IAPs. THAT is nothing but greed.


The problem here is not the app. People need to understand that the road /is/ public! Your neighborhood does /not/ own the road - everyone does. IMO, the solution to the problem of congestion is opening carpool lanes to all travelers, aggressively clearing accidents, creating visual barriers (blinds) around accidents, and above all global acceptance of driverless systems.


The next thing we need to do is change the language of the road to "incidents" instead of "accidents". An "accident" implies there was no way to avoid the outcome; an "incident" simply means a traffic collision has occurred. Only once we rule out human error and every other cause does an incident become an accident. We will find that many of the traffic collisions we currently refer to as accidents could have been prevented in many ways:

- remove vast amounts of human error by automating driving and being much stricter about driving infractions (inattention, reckless driving, DUI)

- fix infrastructure that encourages speeding or leaves no options for pedestrians and cyclists, creating incident hotspots.

This will allow us to focus on the safety issues that really are accidents, which are usually caused by poor maintenance of vehicles and infrastructure.


He should have mentioned Chuck Norris in the discussion.


On that note, anyone feel like maybe we need a Godwin's Law for Star Wars & Star Trek? Someone will always mention the Force, Jedi, "young Padawan," the droids you're looking for, a bad feeling about this, on the one hand, and warp speed, beam me up, fascinating, the holodeck, the Borg etc. on the other. Or maybe that's just the nerds I associate with.


VI !!!

^_^


Because we increased the number of jobs in the public sector, because we instituted a healthcare system where we get less for paying more into it (and one that makes it attractive for business to decrease permanent labor), because our tax policy makes it more attractive for business to hold money overseas, and because government through regulation continually makes it more difficult for individuals to succeed and produce a good or service.


IMO -- better is to just write better tools for plain olde ANSI C. We can build tools to check C as well as any other language. Let's just not waste time building further complexity into our toolset; instead, let's build toolkits and best practices at the very flexible level of abstraction that plain olde C provides. With the right tools and expertise, we can improve the world without the never-ending whiz-bang rewrite.


  > We can build tools to check C as well as any other.
This is simply not true; language semantics determine the kinds of tooling you can build. Ruby, for example, is so dynamic that it's notoriously hard to build tools for.


You need to branch out and prove what you are saying is true -- and you don't need to relocate... That is the strength of being the uber-hacking software "man of foo" that you are. If you are right, then your true value is an order of magnitude more than your employer is willing to pay. "Good!" -- prove it!!! Go to Stack Exchange and up your rep. Market yourself and go grab another job... But in all cases (and respectfully), stop with the whining already.


"Not lucky enough" - that just gauls me every time I hear it. If someone chooses to slack off (only HS education w/o trade and w/o entreprenurial ambition)... They are automatically ranked by the willfully ignorant as "unlucky." Please! There has never been a place and time in the history of mankind when a man with a will could not make something of himself as today! A man with an idea can easily and quickly form a company of one... And can hire overseas to build his idea... Get it shipped to any market he chooses, advertise his idea in any way he chooses, and make a profit in any way he chooses. Today, the world is open to any individual that chooses to work.


Judging others is an impossible task. We use the word "unlucky" because we do not truly know that person. Yes, they could be lazy, but they could also have mental or physical problems that are beyond their control.

So, we say "unlucky" because it hints at the fact that many of the gifts that we humans have were not individually earned (intelligence, will-power, mental sanity, life, general health, your parents, etc). Nobody is in full control of their life, so we cannot judge them with impunity for the way their life worked out.


Those working lads extracting diamonds in Congo with their bare hands don't have much of an opening in the world either. I'm sure they don't work hard enough to deserve it.


It's pretty clear the domain of this discussion is the first-world Western countries: Canada, the US, and the EU.


But of course! Let me cater to those of your kind:

Those low class lads working for less than the minimum wage without any means nor family support don't have much of an opening in the world either. I'm sure they don't work hard enough to deserve it.

Refer to Wikipedia for any further clarification on "poverty" or "poverty in the First World" in general.


Very funny. Solutions to relative poverty in the first world are different from solutions to absolute poverty everywhere else.


The comment I was replying to wasn't talking about solutions, nor does the article, not in depth.


I have outgrown a few laptops since 2006 when I finally gave up dual boot and just ran with Linux. Linux admittedly does have the occasional mixup (especially about competing graphical interfaces and init systems) and does sometimes need a little experimentation to get things working as I would like; however, these seeming inconveniences are mostly caused by newly presented "options" and are not about forced lock-in. Besides, Linux also has some very positive qualities that have become essential to me. Linux maintains compatibility with old code (sometimes very very old code) that is still useful to me. Linux puts an astounding array of tools at my fingertips that help me automate work and learn new things. Linux also keeps me secure, virus-free and conveniently keeps everything installed on disk up-to-date... and it gradually (every 6 mos for me) gets better all the time -- for free. I can't see a reason why I would ever want to return to an OS as restrictive and inconvenient as Windows.


I went in the other direction. I used to use Linux (Ubuntu) but I got tired of essential things breaking after updates, like Nautilus, and the constant battle with getting the printer to work. Eventually I just got too frustrated, went out and bought Windows for $300 and considered it cheap.

Now I need Linux for work (software dev) again, so I gave dual-boot a try. I have installed OpenSuse, Ubuntu, and Debian a total of 7 times since Christmas, each time ending up with something unbootable within days. And I spent a couple thousand dollars of my time getting them to work, fixing the horrible fonts on OpenSuse, etc. Eventually I gave up and bought VMWare, where Ubuntu now sits happily isolated from real hardware, a configuration in which it's somehow more stable. And at least I have automatic snapshots on every reboot, so when something goes wrong I just roll back.

Honestly, desktop Linux is the most expensive OS out there if you count the value of your time. I really like Windows, not because it's a pleasure to use (Windows 8 with that Metro crap was terrible!) but because it just works. If I can't get desktop Linux to work nicely as a software engineer, what chance does your average person have? Unless some company does to Linux what Apple did to BSD (SteamOS?), I don't see any hope for it ever being anything but a niche desktop OS.


>Honestly, desktop Linux is the most expensive OS out there if you count the value of your time.

Counterpoint: I also value my freedom, privacy, mental health, and the longevity of my hardware.

Freedom: I can install it whenever I want on whatever I want. I groan whenever I have to deal with "activation" headaches. It also works just the same on ARM - and anything else! All the programs I use are open source so they come right along too.

Privacy: Even setting aside the Windows 10 debacle, Windows is incredibly noisy on the wire. Linux doesn't make a peep unless I ask it to.

Mental health: System updates happen when I say. It doesn't strong-arm me into restarting when I'm busy doing something else, or hold my system hostage while it does god-knows-what on boot. As for application updates, I have a package manager, so I'm not bugged by a dozen different things like flash updater, java updater, etc. In fact, stuff doesn't spontaneously "happen" in general. If a system service starts chewing up resources or otherwise behaving badly (a rare occurrence) I can a) notice, because it's not lost in the noise of normal system chatter and b) actually find out what it is instead of it being hidden behind "svchost.exe". The primary interface (command line) is comparatively stable - I don't have to keep relearning where things are (it's not perfect in this respect, but it's better than Windows).

Longevity of hardware: I'm writing this on my 2007 EeePC 901. It has 1 gig of ram and runs Debian flawlessly. I never need to restart it and my load average is somewhere around 0.2. Can any Windows do that?

Look, I don't mean to come across as belligerent. You're entitled to choose whatever system makes you comfortable and allows you get work done. But I find Windows wastes far more of my time than Linux.


I'm actually somewhat surprised at the number of people writing that Linux saves them time. Clearly not everyone has had the horrible experiences I have. I'm curious what distro you people use and on what hardware. I tried Ubuntu, OpenSuse, and Debian. All of them, excepting Ubuntu, needed so much work out of the box just to get to parity with a fresh Windows install as to cost me more than a Windows license. Ubuntu was the least stable after install. Updates would frequently break it. I didn't even make it a week this time, and my past experience with it (LTS version) has been that every few months something new would break after updates.


Yeah, it sounds like you've had some really bad luck with hardware.

Since you asked, for what it's worth, I invariably run Debian-based distros. I haven't touched Ubuntu since they decided to get weird with the interface. This laptop is a ThinkPad X61T running Linux Mint + Cinnamon, which is only "okay" - a little laggy at times, but it works fine. Surprisingly, the fastest-feeling, stablest, easiest-to-use machine I have is... the ancient EeePC. Standard Debian + MATE desktop. The thing about Debian is, if you can put up with somewhat out-of-date software, it just does. Not. Break.

I don't really understand where your problems with multiple monitors come from. I just plug 'em in and they work. It's been a while since I had a desktop with an NVidia chip - all my laptops have Intel graphics - but NVidia's self-extracting installers have always worked for me (although they're built against the kernel so you might have to re-run them every upgrade).

I've had the repository problem you describe, long ago - the UK repos were being weird so I switched to France, which worked fine. Nowadays though all you have to do is use http://httpredir.debian.org/, which will always serve you the file from the fastest place it's available from. Never had an issue with it.
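For what it's worth, the httpredir setup is just a one-line change per entry in sources.list (the release name below is an example - use whatever release you actually run):

```shell
# /etc/apt/sources.list -- let httpredir pick the nearest working mirror
deb http://httpredir.debian.org/debian jessie main
deb http://httpredir.debian.org/debian jessie-updates main
deb-src http://httpredir.debian.org/debian jessie main
```

Then run `apt-get update` as usual and the redirector handles mirror selection for you.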

I guess I won't deny that it can sometimes be a hassle getting things working just right on a fresh Linux system. But at least you only have to do it once.


Ubuntu LTS versions on totally unsuitable made-for-Windows laptops (my next one will not be). I've never (touch wood) had an Ubuntu LTS release break during an update. However, I do a fresh install for each LTS release (takes about 45 minutes -- I keep my /home on its own partition and the new install finds and uses it).

EDIT And I don't get how a Linux distro can take more work than Windows to get into a useful state. With Linux I just apt-get the software I want and start work. With Windows, installing even a minimal python/R/unix tools/compiler/ssh client/office package/tex distribution/browser/video viewer set of software etc. takes ages, lots of baby-sitting, and frankly nerves as one navigates around the malware.


Debian needs serious help to get the multiple monitors configured and working at full resolution. OpenSuse had that problem plus terribly ugly fonts. It took a lot of reading up online and trying things before I figured out how to fix that. Ubuntu actually had a broken apt-get for me on some recent installs, it turned out to be an issue with using the us subdomain repos, switching to de made apt-get painfully slow, but solved the problem. Luckily I haven't had this issue on my new VM install, so I'm hoping it was resolved. All of them had issues booting (black screen) with the default open-source display driver and required the proprietary drivers to be installed (this is not an issue in a VM.) Just to get Spotify to run on OpenSuse was a herculean effort that I eventually gave up on, but not before buggering up the system.


> Debian needs serious help to get the multiple monitors configured and working at full resolution

I must have some pretty reduced needs. I just use one xrandr command to tell all my monitors what position to be in and it works - and I think my wife uses the GUI program in Ubuntu's Unity desktop without any pain.
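In case it helps, the one-command setup looks roughly like this (the output names `eDP1`/`HDMI1` are examples - use whatever names `xrandr` reports on your machine):

```shell
# list connected outputs and the modes they support
xrandr

# run the laptop panel at its preferred mode, with the external
# monitor at 1920x1080 positioned to its right
xrandr --output eDP1 --auto \
       --output HDMI1 --mode 1920x1080 --right-of eDP1
```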

> Ubuntu actually had a broken apt-get for me on some recent installs, it turned out to be an issue with using the us subdomain repos

Did you submit a bug report about this? Seems like a bad bug for a distro like Ubuntu.

> Spotify

Spotify is not supported on OpenSuse (it's beta at best on Ubuntu -- but runs very nicely). Trying to run proprietary software on unsupported platforms seems like a very bad idea. And of course installing a bunch of random things in random places will throw a spanner into an OS. Btw, if you want to run Spotify on unsupported platforms, I think there is an addon for Clementine that works nicely, and the addon gets installed in your home dir so it won't break things (haven't used it for years because the Spotify client works fine for me, so things may have changed).

EDIT

I was wrong: Ubuntu doesn't install Nvidia drivers for you. You need to install them after the OS install, and the process looks annoying.


> Honestly, desktop Linux is the most expensive OS out there if you count the value of your time.

I think these words are actually on the cover of the 2004 edition of the Microsoft Anti-Linux Talking Points Guide.

It's not that it isn't credible that you had a bad experience, it's that people have bad experiences with Windows too, and often.

And the Windows problems are harder because the solutions are typically some kind of hideous workaround to the fact that the real problem can't be fixed, instead of the much-maligned Linux solutions that involve typing things into a terminal but, if you type the things into the terminal, it actually fixes the problem.


Yes, I remember the talk about total cost of ownership. But it's entirely valid in my experience. I wouldn't ever use desktop Linux again outside of a VM. The bottom line is that my experience with Windows is rock solid. It looks good out of the box (Metro aside), works out of the box, even supports SLI, gaming, and multiple monitors without any fight. Even "dist upgrades" work. I've never had an issue with malware because I know what to do and what not to. And it's so nice that third-party software runs, and runs well, on Windows. Many companies don't support Linux or only offer a stripped-down product that barely works.


> I think these words are actually on the cover of the 2004 edition of...

So what? The words that you're saying are on the tips of every Microsoft-hating Linux fanboy's tongue.

> ...the Windows problems are harder because...

Oh please. The Linux problems are harder because most of the time the solution doesn't exist at all. (E.g. does a driver even exist for a particular device? Does it actually work, though? Can I get a particular piece of industry-standard software? Probably not - but oh, here's an actually hideous emulation (WINE) of a much better GUI system (Windows) that you can TRY to run it on...)

> ...if you type the things into the terminal, it actually fixes the problem.

Good one - please tell me what to type to be able to run Linux on a touch screen tablet, so I can run Linux in the same places I can run Windows. Also, what do I type to get a decent desktop GUI experience?


Oddly enough, I've had exactly the opposite experience. I'm locked into Windows due to industry standards (translation tools are all Windows-only, and the vast, vast majority of business documentation runs on Microsoft Office), but with my last machine, I installed Ubuntu as the host OS and run Windows 7 in a virtual machine - and love it.

I even have the opposite of your unbootability experience - towards the end of 2014 I rendered my Windows machine unbootable, without hope of recovery. I did manage to boot Linux from a USB drive and rescue my files. Late last year I did a similarly stupid thing with Linux - but I was able to reinstall the kernel without needing to reinstall everything, as Windows had required.

Although you're right about the average-person thing. I shudder to think about walking my mom through a kernel reinstall.


> Honestly, desktop Linux is the most expensive OS out there if you count the value of your time

I had the opposite experience. I do tech support for my extended family and I grew tired of cleaning up viruses and chasing up drivers from questionable sites.

3 years ago, I started pushing everyone to Linux (Mint). I must say, I have saved myself a lot of time (yes, I do value my time)


Unbootable within days? As in, you install a linux distro and it can boot, you switch between using it and using windows for a few days, and then after a few days linux can't boot? That's very strange; I've never seen that in over a decade of using various linux distros and dual-booting. My guess is that some "security" related mechanism in the firmware or windows is checking on and reverting the uefi boot setup somehow.

Since virtually no one buys a laptop designed for linux, but rather hopes that linux has adapted (with all the driver "quirks" necessary) for that hardware, it's really just that linux gives you the tools and the freedom to figure things out and set them up how you want. It can't do it for you. Sometimes it's just an impractical amount of work. But depending on what you do, it's often worth it.


On OpenSuse it was a software install that went bad. The much vaunted snapshot feature didn't work out of the box! It actually could not boot snapshots in that nice little grub menu. I had to reinstall after spending hours trying to rollback to a snapshot.

On Ubuntu it updated the kernel, which made it unbootable. I could still boot the old kernel, but after taking out a ticket and being asked to upgrade my BIOS, that became unbootable as well.

On Debian it was just too much work to get to a point where it supported the real resolution of my displays and my multi-monitor setup. I eventually just threw in the towel and went and installed Ubuntu in a VM, which at least got me up and running quickly.

I'm leaving out a variety of colorful failure stories, but the bottom line is very little worked out of the box, it was insanely time consuming and difficult to get things into a decent state, and then it didn't stay in that state for long.


I stopped using Linux in the early-mid 2000s, when applying a security patch to sendmail required (at least then) that I learn its arcane compiler/build process. Burned at least a half day with no success. That was the last straw in a series of straws.

I was trying to build a business and was losing too much time. Went back to Windows, installed a free version of mailenable via point and click, then configured it through an intuitive GUI within minutes.

I'm sure I lost some tech cred on that transaction and it offended my sensibilities around MS at the time. But, man it just worked and that was priority one. I was already the CEO, developer, and customer support rep. There was just no value in adding Linux SA to the list.

Have never regretted it nor looked back.

EDIT: Interestingly, downvotes don't make any of my experience less true.


Maybe downvoted because things change in the space of 10 or more years...


Perhaps. But, then, here I am replying to someone who is sharing essentially the same experience in the modern Linux era.


> Honestly, desktop Linux is the most expensive OS out there if you count the value of your time.

Can't agree with this more. All the "control" and "freedom" you have over your PC fade quickly in the shadow of the man-hours needed to make things work on Linux.


I wonder if you used hardware that is actually supported by the distributions you tried? If you need/want Linux, I would highly recommend buying something with it pre-installed. You don't have to use the pre-installed OS, but at least you can reasonably assume that drivers will be available.


It doesn't matter. It really doesn't. I have routinely had hardware that was perfectly supported by one distro or another immediately break on the next update.

Kernel versions are particularly picky. With Debian, I have regularly had to jump around between stable, testing, and unstable because one thing or another wouldn't work, or the machine wouldn't even boot, because of the kernel version used.

The number of hardware regressions I've run into honestly staggers me. Video is another area that's notoriously bad. I've had to abandon Linux installs on multiple distros because after some update or another, suddenly some or all video functionality just ceased to work. Debian broke my OpenGL. Fedora developed a system freeze when upgrading to 23. Ubuntu routinely fails to recognize common hardware, or even existing hardware that ran previous versions, defaulting back to ugly VGA resolutions and software-only rendering.

I frankly would take any pre-installed Linux laptop's claims of compatibility with a generous grain of salt, and basically expect that it too would eventually fail on some future version.


I wonder if part of the problem is people going outside of the OS's curated repositories.

With Debian at least the curation of the repositories is one of the biggest attractions for me. Everything in there is tested at length and is known to work well together. I've never had problems staying within the repos.

Friends on the other hand always want to install the latest version of whatever package and install tens of external PPAs to achieve that, without consideration for what that means for the stability of their system.

They still carry their Windows experience and think that to install software you have to go find the software author's website and read their instructions - which much of the time apply to other distros and not the one you are using. I have to keep teaching people the Linux way to install software: use your local repos.


It turns out the OP borked one of his attempts at Linux by trying to install Spotify in OpenSuse...

I.e. exactly what you are saying.


Nothing exotic: NVIDIA 970 graphics card, Gigabyte motherboard with Intel NIC, Haswell CPU, Samsung SSD and a couple HDDs.


I seriously don't recommend checking Linux compatibility on laptops this way (component by component). What about the wifi? How about your system's screen brightness controls? Trackpad? Etc. Getting something that is OK takes work, and getting something really good is a lot of work.

For comparison: try installing Windows on a Chromebook. It'll suck. Install Linux on a made-for-Windows laptop and it'll sort of work if you're lucky and suck if you're not.

If you want to use Linux just buy laptops that are certified -- it's easy and there are several options these days (I'm not going to name any but google for them).


I use a desktop.


You are either unlucky or doing something very weird then.


> I can't see a reason why I would ever want to return to an OS as restrictive and inconvenient as Windows.

I have never been a Windows user, but there are some reasons why I use another 'restrictive' OS after using Linux and BSD for 13 years (including on laptops):

- Microsoft Office. LibreOffice is simply not compatible enough. Though, people are moving more and more to Google Docs, so this issue might disappear in the future.

- No GUI isolation in Linux. It scares the hell out of me that any application can read any other application's keystrokes, mouse events, and viewports. When you have some vulnerability in some client (browser, mail), it could listen in on passwords that you type in a terminal as well. AFAIR Wayland will solve this, but the ecosystem has not moved there yet.

- The lack of consistent keyboard shortcuts across applications.

- Supposedly stable upgrades that break stuff (especially in Ubuntu and to some extent RHEL; I never had this problem in my many years with Slackware).

- The lack of cutting-edge hardware with good driver support. I love my 12" MacBook and wouldn't want to go back to anything heavier with a worse keyboard/trackpad.

For other users, I can imagine that these are also problems:

- Installing applications outside the distribution's repositories is still unnecessarily hard.

- There is a lot of inertia - people do not want to invest the time to learn something new.

- Businesses may still have many older win32 applications that do not run on other systems.

---

Anyway, I don't think the traditional Linux desktop or Mac OS X are serious threats to Windows. It's Chrome, Chrome OS, Android, and iOS.

Edit: I don't want to sound too negative about desktop Linux. I just wanted to give some possible reasons why not everybody may be happy to switch.


I had never even heard of the issue of GUI isolation before people started using it to promote the Wayland idea. It was well known that keystroke loggers were particularly easy to write on *nix-type systems, but once you own a user in almost any environment, getting keystrokes (or anything else associated with that user) isn't that hard in practice. You really have to go further and explicitly sandbox a potentially malicious program.

This sort of issue is why Android (also Linux based but doesn't use X) runs apps as separate users.


I realized that this is an issue because the SSH documentation gives warning about this in the context of remote X11:

> X11 forwarding should be enabled with caution. Users with the ability to bypass file permissions on the remote host (for the user's X authorization database) can access the local X11 display through the forwarded connection. An attacker may then be able to perform activities such as keystroke monitoring.

> but once you own a user in most any environment in practice getting keystrokes (or anything else associated with that user) isn't that hard. You really have to go further and explicitly sandbox a potentially malicious program.

Definitely. But I think the trust model has also changed over the years. We have gone from trusting a handful of well-vetted programs (10-15 years ago I primarily used a browser, Pine, CenterICQ and a handful of traditional UNIX utilities) to more and more programs that are all newer and typically connect to the net, embed browsers, etc. Consequently, we should trust our applications less.

As you say, you really have to sandbox each program. Apple has pushed this quite hard: applications have UI isolation and App Store applications are sandboxed. In the meanwhile, much of the Linux community has been outright hostile to this idea (except the SELinux, AppArmor, and systemd folks) because it builds walled gardens and applications are provided by trusted distributors anyway.

The reality is that people want to install applications outside what is provided in the distro repos. And perhaps, we don't even want to trust every possible application packaged in a distribution.

We should really move to a small, trusted core operating system where everything else is sandboxed by default.


Aren't Chrome OS and Android Linux flavors? That's my understanding.


Chrome OS is a variant of GNU/Linux. Android isn't; it only uses the Linux kernel, and the userland is mostly incompatible with existing GNU/Linux distros (there is some resemblance because of *nix roots and POSIX compatibility, but not much).


That's why I said the traditional Linux desktop. Android uses the Linux kernel, but has a completely different stack on top of it. Also, more and more functionality is moving to the proprietary Google Play Services and proprietary Play Store apps. Chrome OS switched away from X11 and is just a system that boots to Chrome for the average user (and its normal use case).

(Yes I know that you can switch Chrome OS to developer mode and install Crouton.)


You are complaining about compatibility, and you believe that people are moving to Google Docs to handle this?!?


No. People are using Google Docs for (some) new documents, because it handles collaboration far better. When you are using Windows or OS X, you can use both.


There's also Office 365, which runs in Chrome (and probably Firefox) on Linux. The Office/Windows lock-in is definitely a thing of the past.


It's indeed becoming a thing of the past. Unfortunately, the web version is not there quite yet feature-wise. But it's definitely getting better all the time.


I suppose I shouldn't speak for the editing side of things. If I do editing, it's basic Word and Excel stuff. It has been perfect for viewing though, which was what used to keep me stuck with an Office install.


Windows suffers from the same "No GUI isolation"-thing.


Linux is not that great at compatibility with old code or old binaries. Windows has always been better at this.

An example of a *nix with good backwards compat is FreeBSD. The FreeBSD cluster has binaries from FreeBSD 2 (1994) that still run. Try that with Linux -- I guarantee you the kernel and glibc broke compatibility. Hell, there are Linux games from the early 2000s that won't run anymore.


Linux a.out binaries also still run - provided you have the a.out shared libs. But if you have the old binary, why wouldn't you have the old libs?

Also the Linux games from early 2000s do run - you need the libraries they were built against.
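A rough sketch of what that looks like in practice (the binary name and paths here are hypothetical; the point is just that the dynamic loader needs to be able to find era-appropriate libraries):

```shell
# see which shared libraries the old binary resolves, and which are missing
ldd ./oldgame

# keep the era-appropriate libs in their own directory and point the
# loader at them for this one invocation only, so the rest of the
# system keeps using its current libraries
LD_LIBRARY_PATH=/opt/oldgame/lib ./oldgame
```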


> Linux is not that great at compatibility with old code or old binaries.

The default on Linux is source code availability. If the ABI changes but the fix is to recompile or change a couple of lines, then it's a much smaller issue than it would be on Windows, assuming you can code. And even if you can't, since the default is also redistributability, someone who can may have done it already, and the fix will be in the package manager before you even knew it was a problem. Or you can submit a bug which, when the fix is that simple, some random hacker will fix for you because it looks good on a CV or GitHub history.

Compare this with Windows, where you have e.g. some driver with unknown sources that needs to be updated to support AMD64 - it already supported x86 and was 64-bit clean for some RISC architecture, but now you can't do it at all. Or, even more frustratingly, you have the driver source and the fix is very simple, but you can't actually use or redistribute it because you're not up for the five grand and six months it takes to get a certificate for code signing.


On Windows, the usual solution if you can't get source or someone else won't fix it (sometimes even if you can, because figuring out how to set up a build environment to recompile everything is a hassle when you could just change a byte or two... especially if you're not necessarily going to "develop" any further) is to patch the binary. Maybe this is why I find RE tools on Windows have been far better than with Linux or other Unix-like environments.

> Or even more frustratingly, you have the driver source and the fix is very simple but you can't actually use or redistribute it because you're not up for the five grand and six months it takes to get a certificate for code signing.

There are easy solutions to getting unsigned drivers working. I do notice that the Windows and Linux community attitudes toward this are different though --- the latter seems to be "no source, can't do anything" while the former is more like "no source, we'll still fix it". We're not all as helpless and controlled by Microsoft as you may think. ;-)

(I'm someone who recently patched a driver for hardware that I had absolutely no familiarity with before. It was literally a 2-byte change after about 3 hours, most of which was spent learning about the device.)


> The default on Linux is source code availability.

You live in quite the dream world.


With qemu and other virtualization technology, even software emulation in 2016 will be faster than real hardware was in the 90s. And unlike Windows there are no legal problems with licensing.


True, Linux binaries are not as backwards compatible as Win32 binaries. However, with Wine and Mingw64 being around, Win32 binaries are a totally valid way of distributing software for Linux.


Same here. A friend switched me to Debian in 2002 and I haven't looked back since. These days interacting with Windows (even 7 which is similar to what I remember) feels unpleasant.

I use a tiling window manager and again I would never want to go back.

My job does not require me to modify other people's Office documents and for reading them LibreOffice is acceptable. My own documents I usually write in AsciiDoc and share them as a PDF. For something that I really care about I will break out TexStudio and the result will blow Word out of the water.

Most of my time is spent in either Firefox, vim, or a terminal for which Linux is also ideal.

Gimp + Inkscape cover all of my occasional photo / diagram needs.

For video there really is nothing better than mpv (mplayer successor) and ffmpeg.

I do have a Windows 7 VM that I use for Outlook and Lync at the office and it works fine, but it's just another Linux app. ;-)


What tiling window manager do you use? I've been using Awesome but it completely messes up when dragging Chrome tabs on your non-primary monitor. It is very frustrating.

Or maybe the bug is in Chrome and has nothing to do with tabs and Awesome...


I also use Awesome WM, which is ... awesome. :-)

I think I see what you mean - is it the crazy flickering when you drag the newly created window around? I was able to get rid of that if I set the parent Chromium window status to "floating" (Mod+Ctrl+Space in Debian default config).


Sadly not that. What I mean is when Chrome is on my non-primary monitor the second you begin to drag a tag it pops out the window, goes to the primary monitor and refuses to be merged back into any Chrome window not on a primary monitor. It's really strange and quite obnoxious.


I only have a computer with Windows for two reasons currently - games, and for debugging frontend web code on an actual Windows machine (more uncommon - pretty much only has been for open source work).

Otherwise, I too have found Linux/OS X to be sufficient. However, for many of my friends, other reasons tie them to Windows, such as music DAWs.


I would pay an arm and a leg to be able to use Linux (specifically Ubuntu) as my day to day OS. I miss the tools, ecosystem, flexibility and most importantly the shell. I have been toying with Ubuntu 14.04 LTS for the past few months. Managed to get it up and running after an intensive weekend.

A few days later I forgot to wake the laptop from sleep before disconnecting the external monitor.

All hell literally broke loose...


MS Office


Run a Windows VM on Linux (Windows 7 runs great), problem solved

Install it, back up the VM image, and you'll have a lot fewer problems than having it as your main OS


Or, if you want a great desktop experience that is really stable and has stood the test of time - run Windows as the host and Linux as guest.


that, and the Adobe Creative Suite.


MS Office is a good point. But:

MS Office is available even for Android these days.

Otherwise Wine should work, and ain't there some cloud MS Office these days?


Except it doesn't. Office 2007 is over eight years old and runs with significant bugs.

The cloud versions are incomplete and don't work with plugins like Endnote. The cloud version of PowerPoint struggles with movies and large resources.


The vast majority of people using MS Office would do fine with Libre Office or gnumeric and Abiword.


Not if they need to swap documents with other Office users.


Not true.

There are use cases where OpenOffice is a vastly better tool for the task at hand. For instance, reading a CSV file that happens to use extended UTF-8 characters with some fields that have important leading zeroes. In these situations it is possible to import the data into OpenOffice without the prospect of it being mangled by 'clippy'. Sure, you can create a new spreadsheet in Excel and import a CSV file from disk into it with the data read as 'just text' in UTF-8, but the people I send CSV files to do not do that with their Microsoft ways. Consequently you get so far in and realise you actually need to re-read the source data because it has been Microsofted with bizarre things like capitalisation.
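The 'read the data as just text' approach can be sketched in a few lines of Python (the sample data here is made up for illustration): a CSV reader that treats every field as a plain string preserves both leading zeroes and UTF-8 characters.

```python
import csv
import io

# Made-up CSV data: an ID with leading zeroes and a UTF-8 name
data = "id,name\n00042,Łukasz\n"

# csv.reader yields every field as a plain string: nothing is
# coerced to a number, so "00042" keeps its leading zeroes
rows = list(csv.reader(io.StringIO(data)))
header, record = rows
assert record == ["00042", "Łukasz"]
```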

Excel obfuscates data and obfuscates filenames. It also promotes arcane ways of working, e.g. vlookup things held together with blu-tak and string when a simple table join on the original data does what is required correctly with no hand-crafted nonsense.
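The "simple table join" alternative to a column of VLOOKUPs can be sketched with SQLite from Python's standard library (the table and column names here are made up):

```python
import sqlite3

# In-memory database with two made-up tables
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")
con.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
con.execute("INSERT INTO customers VALUES (1, 'Ada')")
con.execute("INSERT INTO orders VALUES (100, 1)")

# One declarative join instead of a hand-crafted lookup per row
rows = con.execute(
    "SELECT orders.id, customers.name"
    " FROM orders JOIN customers ON orders.customer_id = customers.id"
).fetchall()
assert rows == [(100, "Ada")]
```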

Too often I see things being solved in Excel where a small bit of code does a better job of creating the report or things like Fusion Tables do a better job of fancy presentation.

I no longer lock in to Excel world, I don't see it as a professional tool.


That there are some things Excel does badly does not magically mean other tools provide decent interchange with Excel users.


This is, unfortunately, very true.

Try collecting some accounting records in LibreOffice Calc and then finding your accountant uses Excel. Time lost fixing the mess is probably worth more than the cost of MS Office for a single incident. If you need this kind of interoperability more than very occasionally, that alone could be a deal-breaker.

Also, LibreOffice 5 on Windows 7 appears to be a disaster. We installed it for the first time a few weeks ago, on one new PC at work just to try it out, and it's exhibited numerous very obvious graphical glitches, crashes and performance problems so far even just doing basic spreadsheet work. It also seems to have arbitrarily changed a bunch of things from earlier versions, such as the default colours available for colouring or highlighting cells in Calc. Maybe the stability is better on Linux, but even then the changes from LO4 are presumably still the same and just as frustrating.


Can you be more specific?

In regards to changing defaults, that's often the case. Microsoft do the same thing, possibly worse.


Sure. The first example we noticed was that we loaded one of our most important spreadsheets into Calc, and found that most of the colours we use to code different cells don't seem to match the default palette any more. Where we used to just click a couple of times on the toolbar, now it seems we have to manually configure the exact colour or use format painting. Someone in our organisation has to update this particular spreadsheet very often, and that kind of change is going to be horribly frustrating for them (or would be, I suppose, since presumably we're not going to actually migrate any existing systems to LO5 in its current state).


Is this just in the one spreadsheet, or is it across the whole app?

Can you change the default colour palette?

Choose Tools - Options - Charts - Default Colors

https://help.libreoffice.org/Common/Default_colors


All I can really say is that it was immediately apparent that a lot of the colours we used to use, which came from the default palette in older LO versions, aren't in the default palette any more, and if you go to the corresponding places in the format dialogs those colours do show up as "User" in LO5.

We'd have to look into the sorts of changes you mentioned if we were going to stick with 5, so thanks for the suggestions. However, given the graphical glitches and instability, which unfortunately make it borderline unusable on our test system, I don't think we'll be considering a larger scale migration any further until (I assume) some future updates that fix those things have arrived.

The glitches and instability appear to be across the whole suite, BTW. Basic stuff like drawing menus, toolbars and tabs is broken in very obvious ways, all the time. I'm guessing there's some fundamental problem with the routines LibreOffice uses to draw those graphical assets instead of the standard Windows functionality. Either that or there's some horrible conflict with the graphics drivers on the new machine, which is always a possibility but would be surprising at this point given how many other programs do seem to work OK.


That could actually be the issue.

Try turning off OpenGL:

You can do this by going to Tools ▸ Options ▸ LibreOffice ▸ View.

P.S. I'm biased about LibreOffice as I have commit access and I'm working on the code at the moment. I do acknowledge it's frustrating when things don't work, and I don't want to deny there are issues preventing you from adopting.

If it's not too much bother, can you file a bug?

https://bugs.documentfoundation.org/enter_bug.cgi


Sorry, turning the "Use OpenGL for all rendering" setting off made no difference.

I honestly wouldn't know where to start with filing useful bug reports. I'm a software developer myself, so I appreciate the need for useful information and ideally reproducible test cases, but at the moment we're seeing graphical glitches in everything from menu displays to dialogs to the tabs for different sheets, far too many different areas to isolate and investigate each one. It seems more likely that some combination of hardware/software isn't playing nicely on the test system, since presumably if everyone were seeing what we are the LO team would be seeing plenty of feedback already. If you'd like to investigate or try to triage it somehow, I can ask someone to get in touch by mail, and maybe that would lead to something specific enough to be worth putting in a bug report for more detailed investigation?


Sure, I'll do my best :-) Even if I can't sort it out, I can help point to the right places or maybe ask the right folks for suggestions on what's going on.

chris.sherlock79 at gmail.com is my email address, or hop onto #libreoffice-dev on Freenode and ask to speak to chris_wot (that's the IRC channel I hang out on the most frequently).


Not if you have to teach every new intern how to use LibreOffice when they already know how to do stats and calculations in Excel.


Gaming


I have a Linux gaming rig. It's actually a dual-boot setup but I'm in Linux 99% of the time. Depends on the games you want to play, but there is a decent selection of games on Linux, and with frameworks like Unity it's pretty easy for devs to make Linux builds. Most games I play are native Linux builds, and then I have Wine for other games. I do use the Windows partition as a last resort but it's rare (since building the computer I've used it once, for GTA5).


More and more games are console "ports" (Both the PS4 and XbOne are basically an AMD x86 PC). And those that are not are "indie" games that work across all platforms supported by the likes of steam.


Wine and / or PCI pass through to a windows VM. The second one takes a bit of setup, but it works wonderfully.


It is still... A language for the quick hack, but not for the reader (or author) to understand a day or even an hour after the writing.


It's an Algol-based language. I never quite got what people "got wrong" with Perl code. Yes, I've seen bad examples, but those would've looked pretty much the same in C, Pascal or Python (deep nesting, bad names, overuse of regexps).

Back in the days one argument was using "grep" or "map" instead of explicit for-loops, but in this day of functional programming, that would seem a weird criticism.

Is it the type glyphs? ($%@)

References are a bit unnecessary, yes.


Yes, it's the type glyphs. (For me, at least.) Back when I was a DBA writing database back-up scripts in Perl, I could never remember how to dereference a value in an array or hash properly. (It reminded me of my problem with pointer notation syntax in C). When I first saw Python, I thought "Wow, it's like Perl, but without the confusing notation." I rewrote the scripts in Python and never looked back.
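A made-up example of what "Perl without the confusing notation" looks like in practice: a nested structure in Python is indexed the same way at every depth, with no sigils or dereferencing operators involved.

```python
# Nested dict of lists: the indexing syntax is uniform at every level
tv = {"simpsons": {"kids": ["bart", "lisa"]}}
tv["simpsons"]["kids"].append("maggie")
assert tv["simpsons"]["kids"] == ["bart", "lisa", "maggie"]
```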


> I could never remember how to dereference a value in an array or hash properly.

I too struggled with that and came to hate it.

Perl 6 directly addressed this and some related problems.

First, sigils are now invariant - they're just a part of the name that signals which of the three data structure types a particular variable is. Second, there's no need for a `->` dereferencing op.

  my @array = 1,2,3; # `@` sigil means indexable var
  say @array[1]; # prints 2
It might be interesting to look at an example. Perhaps you could share an example of code in Python that illustrates some data structuring code that would be confusing in Perl 5 syntax, I'll try to show what the Perl 6 equivalent would be, and we can see if Perl 6 really does clean this part of the language up.


Nested arrays are a pain in the arse in Python. Do I use extend or append? Then strings are treated as character arrays. I far preferred Perl's approach where you would reference or dereference variables instead.
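The append/extend distinction mentioned above does trip people up; a quick sketch of the difference:

```python
# append adds its argument as a single element...
nested = [[1, 2]]
nested.append([3, 4])
assert nested == [[1, 2], [3, 4]]

# ...while extend splices the argument's elements in one by one
flat = [1, 2]
flat.extend([3, 4])
assert flat == [1, 2, 3, 4]
```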


This is because it was designed to be usable as an interactive shell. When you're working with an interactive shell, you want its language to be (at least somewhat) optimised for writeability, which by necessity means a lack of optimisation for readability. (Explicitness and some level of redundancy aid readability but hurt writeability.)


I believe Perl was meant as a replacement for shell scripts, as well as sed and awk scripts. It wasn't designed as an interactive shell. For example, the original perl man page says (quoting from http://history.perl.org/PerlTimeline.html ):

        Perl is a interpreted language optimized for scanning  arbi-
     	trary  text  files,  extracting  information from those text
     	files, and printing reports based on that information.  It's
     	also  a good language for many system management tasks.  The
     	language is intended to be practical  (easy  to  use,  effi-
     	cient,  complete)  rather  than  beautiful  (tiny,  elegant,
     	minimal).  It combines (in  the  author's  opinion,  anyway)
     	some  of the best features of C, sed, awk, and sh, so people
     	familiar with those languages should have little  difficulty
     	with  it.
The closest that perl had to an interactive shell was with the "-d" debugger. See for example this synopsis from 2007 at http://archive.oreilly.com/pub/post/writing_a_modern_perl_re... :

> One area in which Perl falls behind other languages is in its lack of a usable Read-Evaluate-Print-Loop. (perl -de 0 just isn't enough for me.) I've used Ruby's irb and a handful of Python shells, and I'm starting to become a fan of ghci, but I fall back on perl -e far too often for my sanity.

An interactive shell should, in my opinion, support more than a REPL. It should also implement job control, like suspending a job or starting jobs in the background. Perl definitely was not invented for that purpose.


That's the number one reason I don't code in Perl: I can't even read my own code 6 months later... Perl is not a programming language, it's closer to a natural language, where you can sort of invent stuff and it works...


Is it the for loops? The if statements? The function calls? I've written lots of Perl. It doesn't have to look that much different than any other language. If you find a terse 1 liner on the Internet and use that as a solution, you might not be able to read it a week later.


I suspect it's the wide range of special operators, variables and default arguments. Knowing, for example, that shift() shifts @_ by default. That <> iterates over @ARGV. That chomp chomps $_. Messing with @INC changes library search paths, etc.

Or, perhaps the syntax for dealing with elements in a somewhat deep data structure, like: push(@{$TV{$family}{kids}},"anotherkid");

Sometimes, people are just griping about regex syntax though, which seems disingenuous, since many languages use the exact same pcre expressions.

I like perl, but if it's been a while, there's definitely some back and forth with books to decipher something I wrote some time ago.


It largely depends from whom you learn. Most people google and run with the first thing google throws up, and perl has many, MANY (due to its age) really shitty learning resources. Even the most popular one, the book Learning Perl, has, as of its 6th edition (2013), grave flaws and miseducates newbies severely.

Some comments from the position of someone familiar with Modern Perl, which is largely what's practiced on CPAN:

> shift() shifts @_ by default

This is discouraged, precisely because it messes with @_. It has valid uses, but people tend to make sure it's clear from the code why it's used. (Mainly in OO helper stuff.)

> <> iterates over @ARGV

Almost nobody uses that, precisely because it's very unreadable. It only exists for compatibility reasons and was made to serve people used to sed/awk.

> chomp chomps $_

I haven't seen code use chomp in ages. I've never used it myself.

> Messing with @INC changes library search paths, etc.

This is rarely used as well. Mainly only when doing release engineering. It's also most often done via `use lib 'whatever';`.

    push(@{$TV{$family}{kids}},"anotherkid");
That is an outdated way to write that. In a decent modern style it would be written like this:

    push @{ $TV{ $family }{kids} }, "anotherkid";
And with a more modern Perl, this:

    push $TV{ $family }{kids}, "anotherkid";


Yes, I think the biggest issue with perl was the devil-may-care culture around it. That was far from the only issue, though. It had (has?) some gratuitous features which are bound to cause hard to find bugs, like implicit conversion of numeric strings to numbers in certain contexts. That particular feature was the reason I chose to learn python in the mid 90s, rather than perl.


> implicit conversion of numeric strings to numbers in certain contexts.

In Perl 6 at least, and to a large degree Perl 5 too, context only "causes" bugs if coders make assumptions that are invalid in Perl.

For example:

  say $foo + $bar
adds two numbers. So the context for $foo and $bar is numeric. So Perl coerces them to be numbers. If you didn't mean to add two numbers, don't use a numeric operation such as `+`.

If you want Perl 6 to make sure $foo and $bar are numbers already, then add a type to their declaration:

  my Int ($foo, $bar);


>>It largely depends from whom you learn.

In my case, when you learn :)


Your comment actually made me scroll back to the top to double check this thread was actually about Perl 6, not Perl 5!

Part of the point of Perl 6 was to address problems in the language and Larry (and/or those who wrote the original Perl 6 RFCs) considered several of the things you named to be problems.

> wide range of special operators, variables and default arguments

In Perl 6:

* The only special op I'm aware of is assignment.

* The only variables considered special are the "it" and "them" variables ($_, @_ and %_), the current match object ($/), and the current exception list ($!).

* There are still predefined variables, such as a DISTRO variable which contains an object representing the OS etc. on which Perl 6 is running, but I haven't found those problematic.

* Almost all use of default arguments in built-ins has been eliminated. The main exception I'm aware of is that subs and ops related to matching still default to operating on "it".

The Perl 5 <> op is gone.

> Or, perhaps the syntax for dealing with elements in a somewhat deep data structure, like: push(@{$TV{$family}{kids}},"anotherkid");

This would be something like this in Perl 6:

  my %TV;
  my $family = 'foo';
  push %TV{$family}<kids>, 'another'; 
> Sometimes, people are just griping about regex syntax though, which seems disingenuous, since many languages use the exact same pcre expressions.

This is talked of in Perl 6 circles as being ironic because regex syntax has been thoroughly cleaned up (and massively powered up too) in Perl 6.[1]

> I like perl, but if it's been a while, there's definitely some back and forth with books to decipher something I wrote some time ago.

One of the many downsides to Perl 6, imo, is that this back-to-the-book-oh-yeah aspect is still there -- but there's only incomplete doc and no books yet written by the likes of Larry.

[1] See, for example, https://github.com/moritz/json/blob/master/lib/JSON/Tiny/Gra...


>>Your comment actually made me scroll back to... check this thread was... about Perl 6, not Perl 5!

Well, the context of the parent, and its parent, was not Perl 6.


I'm a Perl newbie. What can you advise me to read (+exercises) to learn Perl 6?


IMHO Project Euler is a good start. There are solutions published in Perl 6 to a large number of them.


Then you're doing it wrong. Do you write in the same manner in other languages, too?


I can write readable Perl by writing it as if it were Python, adding extra braces where required, a couple of use declarations at the top, and wrapping some system functions with clearer names. But if that's what you want then it's easier to just write Python. Perl's USPs like regex literals and $_ rely on you writing in an unreadable style.


So, Python regexps are suddenly more readable, even though they need additional layer of backslashes?

$_ does not rely on unreadable style. First, it's a common idiom, so it's in no way less comprehensible than a list comprehension in Python. You just need to understand what it means. Second, you're under no obligation to use $_. I often avoid it if it doesn't make my code reflect my intentions better. Funny thing, I do the same in Python, C, and Erlang.

And I hear about $_ and regexp literals as unique selling points (if I read your acronym correctly) only from people that don't really write in Perl.


> So, Python regexps are suddenly more readable, even though they need additional layer of backslashes?

Python encourages a style that makes less use of regexps. (And you don't need extra backslashes, you can use r'...').
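A quick sketch of the r'...' point: without a raw string, each backslash has to be doubled once for Python's string parser on top of whatever the regex engine needs.

```python
import re

# Ordinary string: the backslash is doubled for Python's string parser
m1 = re.search("\\d+", "abc123")

# Raw string: the pattern reads exactly like the regex itself
m2 = re.search(r"\d+", "abc123")

assert m1.group() == m2.group() == "123"
```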

> And I hear about $_ and regexp literals as unique selling points (if I read your acronym correctly) only from people that don't really write in Perl.

Nice ad hominem. Go on then - what's your USP for Perl? Why should one use it over Python/Ruby/...?


> Python encourages a style that makes less use of regexps.

Yes, even in the places where a regexp would help very much. And even if you insist on using regexps, Python makes them highly inefficient or cumbersome (or both), because if you're careful, you'll either get code that needs to store the compiled pattern somewhere vaguely related to the code at hand, or code that compiles the pattern on every use. If you are not careful enough, you'll get the two at the same time.
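The two styles being contrasted look roughly like this in Python (for what it's worth, the re module also keeps a small internal cache of compiled patterns, so the inline form is not literally recompiled on every call):

```python
import re

# Style 1: compile once and store the pattern near the code that uses it
WORD = re.compile(r"\w+")

def count_words(text):
    return len(WORD.findall(text))

# Style 2: pass the pattern string each time, relying on re's cache
def count_words_inline(text):
    return len(re.findall(r"\w+", text))

assert count_words("one two three") == 3
assert count_words_inline("one two three") == 3
```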

> Nice ad hominem. Go on then - what's your USP for Perl? Why should one use it over Python/Ruby/...?

Thank you, though I wasn't referring to yourself. It was pretty clear you don't write Perl much.

My reason to use it is that it was my primary language for over a decade. For other people? I don't know. All three are similar in what they can do and how it looks afterwards, as all of them are scripting languages in OOP land.

Combine that with Perl's learning curve, and you'll see why Perl usage has declined over the last decade.


Scala uses [$]_ in very similar ways and everyone loves it.


"_" in Scala is not a magic variable like in Perl. It's a very mechanical syntax transformation: it just expands to "x => x" (or equivalent, and without colliding with any "x" that's already defined).

In practice maybe the use is similar. But having a comprehensible theoretical model makes all the difference in terms of making sure you can actually read the code and understand what will happen.


There is nothing particularly magical about $_. It is merely a global variable that is optionally used implicitly by various builtins, and is scope-localized by loop control constructs.


Are you sure you're not just used to it? The description you've just given sounds very magic to me.


That depends on how you define "magic". I typically define it as "cannot be implemented in the language itself", which does not apply to $_, as it can be reimplemented in Perl itself. If I were bored enough I could create a module that sets up a $this variable that acts exactly the same, in pure Perl.


Got some examples of your Perl?


Afraid not - I only ever wrote Perl for employers, for my personal projects I used Python.


I have seen just as much crappy unreadable Python in the wild.

