You're effectively describing trickle-down economics: "help out rich people so they pay more taxes, which will trickle down to poor people." That's not a real thing. The economy is trickle-up. Income inequality is widening. San Francisco has the worst income inequality of any city in a developed country: the top 1% earn 44x more than the bottom 99%. The number of homeowners has actually decreased in the last ten years. The cost of living has increased in line with what the top 1% are making. The recession hurt low-income people far more than the top, and yet we subsidized the shit out of the top by giving tax breaks to tech and Wall Street. We basically created a drag race between a Ferrari and a Civic, gave the Ferrari a head start, and then sat around waiting for the Civic to catch up.
I'm a social democrat. My point is that we shouldn't be dependent on the whims of private companies feeling charitable or not. If society needs money from them, then that's what taxes are for.
I don't think that is the OP's point at all. They aren't saying the tax rate should be low, they're saying that the social safety net should be handled by the government and charities rather than for-profit businesses.
I love how inmates get a better education and more support from tech companies than the adults who live in the Bay Area. The Bay Area has the worst income inequality anywhere in the developed world. Bootcamps are a scam that cost tens of thousands of dollars, and yet inmates, people convicted of severe crimes, are treated to free education. From a business standpoint this makes sense: inmates have no bargaining power and can flood the coding labor pool, lowering wages for businesses. How is a bootcamp graduate supposed to compete with modern-day slaves?
By the way, the W.K. Kellogg Foundation is a eugenics group. What are they doing here? This program is so fishy.
I'm a bit unclear on your perspective here. Are you saying teaching inmates is wrong because they've apparently lost their chance at success in life because of their "severe crimes"?
Regardless of that perspective, here's a truth: most inmates will be released at some point. As a matter of fact, a lot of the inequality you reference is a result of those with criminal backgrounds and the socioeconomic conditions that gave them those backgrounds. We have a rich set of statistics about what happens to people in that situation: desperation -> crime -> prison -> repeat. Personally I don't like that world. I want to feel safe in my world, and anything we can do to short-circuit that recidivism cycle makes us all safer.
The "but our jobs! but our pay!" refrain is a common one, but our industry has WAY too many job openings for this to be a legit worry (the very presence of the h1b visa proves this). As for salary, as those jobs get filled, whether by bootcamp grads, or journeymen, or trained individuals with a criminal record - it will push income down for some. Some incomes are a product of scarcity, and you only need to look at a supply/demand curve to know what will happen to prices. In the late 90s, you'd spend tens of thousands of dollars to get a basic website. Now you can get one for a few hundred.
> The entire point of prison is to remove them from society.
That is the immediate effect, for sure. But that isn't the issue.
Unless someone is convicted for life without parole, or a death penalty, then they likely will be released into the general population at some point.
Do you want them to have skills that might allow them to support themselves? A near-term investment in education might be cheaper for society, and more humane, than releasing them into a situation where they have fewer opportunities to make a living.
Like it or not, most inmates will be released. Patterns of recidivism are well established. If they reoffend, we will spend more resources on those prisoners (to say nothing of the costs to society when they do reoffend). It seems reasonable not to want to spend money on someone who has already messed up, but the practical reality is there's a real cost to not making those investments.
Put a different way, this is about those who haven't committed crimes: those affected by future crime, and those who haven't yet committed crime but statistically are likely to (think children of the chronically incarcerated).
The point of prisons is to remove them from society until they can function in society again. But I see your point; we should be putting more spending toward those who have never been convicted than we are now.
I'm unclear whether you're OK with spending other people's money but not your own. The fund supporting this initiative accepts donations. Have you donated? Have you volunteered? No, you haven't and you won't.
Why do you say that? I have supported educational efforts in prison, but not this specific one. I have personally hired at least 3 people who have been in prison.
Hi. I was a cofounder at Hack Reactor, an SF-based coding bootcamp that also provided the first set of curriculum and instructors for The Last Mile. I personally taught many bootcamp students in SF as well as inmates in San Quentin. Everything you're saying about bootcamps and the motives of people running programs like The Last Mile is total malarkey FYI.
You ever think about how much we subsidize businesses by paying for college ourselves? The older I get, the more I think the university-to-corporation pipeline is a fucking racket. It also sucks for employees, because here come these kids who were allowed to learn about all the new technologies you wish you knew, with zero distractions and other obligations, now comin' in hot on your heels to take yer jobs en masse, and because of the larger labor pool they all undercut one another's bargaining ability, lowering wages. If I owned a company and I was lookin' to hire, I'd be callin' that a twofer: free employee training and hiring discounts!
University is a total racket. We pay these institutions to treat us like their low-level employees while they provide job training for companies. Corporations have successfully offloaded their training responsibilities onto a process that used to be much broader than vocational training.
And if you try to study some of those broader topics, you're a sucker: don't study philosophy when an extra accounting or STEM course would be a "better" use of your time. So you super-specialize, and then after 4 years of college and 5 years in industry you're burnt out with few other skills, so the only option is to get back onto the education treadmill and bet on another highly specific vocation where you once again start your career as a junior.
Most STEM courses aren't vocational training and aren't super specialized. I think you might have a biased, negative view of what STEM courses are?
Edit: Let me state it a different way that might shed more light on my point. A CS major can pass all of his/her classes with a perfect GPA and still be incapable of writing software ready for a production system (even at small scale).
The STEM degrees emphasize fundamentals that are rarely (if ever) used in day-to-day "real jobs".
For some value of 'super specialized' you are correct. For a value that includes the bigger perspective on our culture and what it means to live a good life, an exclusive focus on STEM is indeed 'super specialized'.
>For a value that includes the bigger perspective on our culture
Sure
> and what it means to live a good life
That's just self-aggrandizing bullshit. There is no class that will tell you what it means to live a good life. Anyone who thinks so is dearly lacking perspective.
>an exclusive focus on STEM is indeed 'super specialized'.
An exclusive focus on STEM will include the philosophy of science and what it means to seek truths about the physical world. IMO that has immensely more value in a philosophical sense than you seem to imply.
I believe you are proving my point. For example, the goal of much ancient philosophy was exactly that: working out what it means to live a good life, and the theory of it was very well developed. Most of the culture you take for granted as 'common sense' is directly based on this philosophical development.
STEM at best tells you how to do something, but can never tell you what to do, or why to do it. For that you need philosophy, much more than philosophy of science.
This is a very pessimistic world view. I found that in the engineering faculty the subjects we were taught were very broad. However, I continued to learn on my own in my free time after I got my degree. I've been doing this consistently for the last 8 years or so. I now have completely new skills and much more depth of knowledge on CS topics than I had coming out of university.
The point is that while spending your free time on something with a direct career benefit is your choice and pleasure, others often have other valid and important uses for their free time, so for them there is a real cost.
The question is whether the cost an individual incurs by choosing not to spend their free time on career-related skills is ethical or good for society.
I believe this is the crux of the problem - it favours people with minimal external life factors or responsibility, and they’re quite often the ones to rise to power, therefore creating a “well it was good enough for me” sentiment lacking empathy.
By contrast, while some people just want to kick back and collect a paycheck, there is a whole segment of people in the middle ground who are hungry to learn but are stretched so thin that they can't outside of work. That whole segment isn't being catered for, and therefore an opportunity exists to tap into it.
It's not realistic to expect to go to school for 4 or 5 years and then work for the next 30 without learning anything new. Or rather, not if you care about advancing and making more money. Maybe I'm lucky that I actually enjoy it, so it doesn't feel so much like work to me.
I'm a bit confused about what you're arguing against in my point; you continue to super-specialize in your off time. If you wanted to switch vocations from something in the CS domain, how much of what you now know and have self-taught would apply?
At my university engineers were offered two elective courses in the faculty of arts or sciences. That's not a particularly broad education.
A B.Eng covers so much "basic" knowledge that you never really have time to specialize in anything. Doing physics and math courses is hardly becoming specialized in SE. It takes a long time and years of work afterwards to become specialized in something.
The good news is, you can do it without going back to school. IMO an SE will get little benefit out of going back for another degree. You'll get much more ROI spending your time contributing to OSS projects and making a name for yourself.
I think this really depends on your choice of field. In the tech space, so many engineers didn't major in or necessarily even take CS courses in college. I have interviewed and hired plenty of people with diverse collegiate focuses.
Sure, CS or STEM courses are probably really solid to pair with a non-CS major because they teach you stuff that helps extend your abilities in your own field. So I can see why my friend who was going to school to be a nurse might have wanted to take a CS course or two instead of minoring in sociology.
CS, being still a bit of a wild west, still has flexibility, but you definitely couldn't take some nursing courses (which, let's be honest, aren't even offered) and switch to that after burning out in comp sci.
The courses I took were the same that anyone getting a major in that field would take, and I was required to take them beyond just an introductory level.
They were total jokes at my (admittedly) second-rate-at-best state school. Most of the gen-ed courses failed to go beyond material we'd covered back in 10th grade or so. The English courses were probably the nearest to being remotely "serious" since they at least expected writing and critical reading on a slightly-above-high-school level, usually pretty early in the course.
Yes, by ultra-focusing on tech we can make a decent living, but we lack the bigger picture and become the peons of those who know better and can articulate their view clearly and logically.
E.g., look at any PG essay that tries to talk about broader philosophical or political issues and you'll see this limitation. His frame of reference is stuck in a recent Enlightenment framing of the world. Granted, PG is indeed a great communicator in technical fields.
We were supposed to take "History of Technology", which I guess is meant to be the counterpart to "Business Math" classes or whatever. I really enjoy the humanities, so I took all the real electives I could.
I disagree. I went to a state university and it really made me a better person. I worked with diverse groups of people, and while it was tough, I enjoyed the experience. Plus, it wasn't really that expensive considering my income now versus before.
However, I guess non-technical/engineering degrees have different results.
I had a good experience at undergrad, but that was due to its divergence from the norm. It actually gave me the broader perspective by having me read thousands of pages of the primary literature for the Western canon, along with in depth critical group discussions of the texts, and learning to write coherent papers. Nothing in my CS degree impacted my life, except making me marketable. Much of the CS I could have picked up on my own, and very little have I actually used in day to day jobs, except the programming experience. On the other hand, the literature program has indeed changed my life.
This seems to be a very unpopular thought within CS culture and I find that really unfortunate. It feels like people are rushing to reduce their education, their lives, to optimized market interactions and that's a terrible lens for a human life. There may be an argument that it's a method of successfully navigating our society, thus enriching one's personal or familial existence, but it seems to me that would just lead to a poor societal structure with few common bonds among the people within it.
It's a side effect of no safety net, knowing that the slightest mishap could put you into crippling debt. To stand still but for a moment is to be trampled by the masses.
By the time you are in a financially stable situation, old habits are ingrained.
I like how a liberal arts education is "divergent from the norm" now. The primary function of university is to make people read for four years. Business degrees, CS degrees, essentially job training programs are a bastardization of the institution.
I think the classic ideal of a liberal arts degree is awesome... as a second or mid-life degree. The option to read and think in depth and breadth seems to have more potential once you've lived a little more than the average 18-yr-old, just because you tend to have more experiences and viewpoints than a high school grad heading to uni.
On the other hand, by that point you'll have a bunch of habits and ways of thinking hardwired that you did not choose for yourself. It also becomes something of a Sapir-Whorf dilemma, where it becomes very difficult to even realize one's thinking has been shaped in this way.
My experience of interacting with older, more stable 'intellectuals' who do not have a broad background of reading is that they acquire an indolence toward foreign and older ideas, subsisting on a shallow 'tolerance' as a sign of their broad-mindedness.
My particular liberal arts program is "divergent from the norm," including from modern liberal arts programs. Just about all programs, liberal arts or otherwise, are completely framed within an enlightenment view of reality, largely due to dogmatic materialism. Classes that do diverge from materialism have lost a coherent way to talk about an alternate worldview, leaving their terminology sounding very wishy-washy and illogical, like woo-woo Deepak Chopra.
Off topic, but that still sounds like it's framed within the Enlightenment period. If you're reading books and valuing literacy (i.e., individual interpretations of texts, as opposed to being told what a book means), then you're still framed within the "enlightenment view of reality."
By 'enlightenment' I'm referring to a particular worldview perpetuated to denigrate the Western tradition and broader philosophical outlook in favor of a focus on empirical sciences and radically egalitarian social mores. The basic idea of 'enlightenment' is there is no objective and learnable purpose to the natural world and human society, and instead once we learn how to manipulate the natural world we can subject it to whatever ends we desire. In general it results in an implicit rejection of the ontology and teleology discovered by philosophers like Plato and Aristotle. This rejection may be valid, but students are not even given a clear view on the matter so that they know what they are rejecting. Instead, they tend to be educated in the criticisms offered by enlightenment writers, and filter the rest of history through that very limited lens.
It's strangely refreshing to see this particular criticism of the Enlightenment; I'm much more accustomed to hearing criticisms from the postmodernist direction. I disagree with your statement in a previous post that their arguments are incoherent, particularly the early exponents like Foucault or some of the Frankfurt school. I'd also point out that much of the Enlightenment tradition is not ontologically materialistic; in particular, German Idealism embodied in Kant, Schopenhauer etc. stands against materialism.
Based on what you're saying here, are you arguing for a kind of Scholasticism?
Finally, your criticism of a "radically egalitarian" view is somewhat perplexing to me. Would you mind expanding on that point?
In my opinion the idealistic variant of the enlightenment is conceptually not significantly different from materialism. The big thing is rejection of teleology, which also results in the radical egalitarianism since there is no longer a purposeful ordering to reality and no longer a natural law.
And yes, a teleological philosophy like Scholasticism makes the most sense if we are trying to figure out the best way to live. Otherwise we just end up with the specious word-game philosophy that everyone hates.
That's sort of what the Enlightenment was about... The Enlightenment period was a decentralization of information caused by the reinvention and widespread use of the printing press in Europe. During the Dark Ages, Europe's literacy rate was comparable to pre-Mesopotamian levels. The fall of the Roman Empire led to a fracturing of European civilization and the near-total loss of literacy. Latin fractured into a dozen languages because priests wrote and read at a first-grade level, misspelling words, reading with one finger slowly tracing the text, mouthing each word phonetically... Ancient Greek texts were completely lost for a time...
Because nobody could read and copies of the Bible were scarce, the Catholic Church was the single source of the word of God. The printing press changed things. The Bible became widespread and people read it for themselves. With that came an important shift: one's own interpretation of a text was a valid interpretation. Tons of important literary works became widespread. The middle class valued literacy, saw it as a ticket to wealth, and began teaching their kids to read and write competitively at younger and younger ages. They invented the education system we have today: the whole idea of a sequential learning system based around books, and of becoming an adult when you could read at a certain level (as opposed to the Catholic belief that you were an adult when you were old enough to fight, at age ten). That was also the Enlightenment and Romantic periods. Protestantism came about because people valued individual interpretations of the Bible, which the Catholic Church had serious qualms with, since that was their entire claim to authority...
So the fact that you grew up in a family which valued literacy, which sent you to a university where you spent four years reading books, that you came out of that with your own valid and rational ideas about what those texts mean, and that your rite of passage into adulthood is based on your ability to read and write at a university level: all of that is still very much framed in the values of the Enlightenment.
The precise narrative you just articulated is that of the Enlightenment in the 18th century, a period much later than the invention of the printing press and Protestantism.
A good book for you to check out is Rodney Stark's "For the Glory of God", written by a secular historian debunking much of the above narrative.
The fact that many educated today take your narrative for unarguable fact also illustrates the problem. The 'enlightenment' narrative is ironically very self limiting.
The brutality of the Dark Ages has been debunked. The timeline I just gave you about illiteracy, the printing press, the Enlightenment, and our 400-year-old education system remains intact. Neil Postman is a good source for the history of education (see "The Disappearance of Childhood"), or you can simply look it up on Wikipedia.
https://en.wikipedia.org/wiki/Dark_Ages_(historiography) There are some graphs there showing how the Enlightenment coincides with exponential growth in mass publication.
As for Protestantism and the printing press being invented prior to the Enlightenment: yes. Without widespread use of both you don't get the Enlightenment, for the reasons I mentioned previously. And neither were really new ideas, either. Ancient Greece had high rates of literacy and a belief in interpreting texts for yourself, but these ideas were lost during the Dark Ages.
Hmm, I've received different information than you have. There were more printed books afterward, but that doesn't mean there was significantly less learning and literacy before, although a different proportion of the population was literate. And a lack of general literacy does not necessarily entail a lack of learning or understanding. For example, much of the iconography comes from that era, and the lay person was taught through imagery and liturgy, not necessarily to their detriment. As far as I know, the university system we know today came into being mostly within the context of the Catholic Church's clericalism, and much of the great philosophical synthesis came about during that time, especially with Thomas Aquinas.
At any rate, we are obviously referring to different things by the term 'enlightenment', definitely different historical epochs.
People going to university in order to get a good job are playing the wrong game. Anyone looking primarily at the "cost vs. future earnings" for any particular program or course in a university setting is
1. not going to find a good economic deal
2. not going to get any true value
3. not going to have fun
You're better off taking a 2-year programming diploma or going into the trades if you want the highest short/mid-term pay-off.
If you're playing a longer-term strategic game and actually enjoy learning for the sake of growing (i.e., you'd do it on your own regardless), look at the career-long pay-off.
I went to community college but didn’t get my Associates because the final class was me paying to work in the computer lab. It was so mind numbing I just couldn’t bring myself to do it.
In France, companies all pay quite a lot of money into a fund. The fund's purpose is to finance employee education if they want to switch careers.
For instance, my friend was a mechanical engineer at Total, and after 3 years he left to study ML for a year. Not only was the school fully paid for by this fund, but as a student he still got a decent share of his salary every month. Best of all, he can go work for another company after his degree without any problem.
I thought that was a great idea, although in practice not everyone is eligible for this particular program. I forget what it's called; CIFRE, maybe.
This particular degree isn't paid for by the government like the rest of French education, although it is taught at a very famous engineering school in France.
The name got changed several times. CIFRE is a PhD course paid for by a company and subsidized by the state. So you get to do real work and get a PhD.
The thing you are thinking about is probably the Compte Personnel de Formation (CPF), which was called the Droit Individuel à la Formation (DIF) a few years back.
Note that this is not some incredible sum of money; however, a year at a classic STEM university costs a few hundred euros in France, so what you get from the CPF is quite enough.
For five years of engineering school I spent about 3500 euros, which included insurance. Full room and board with a private room costs a bit more than 300 euros per month. The difference with US education prices is just staggering.
You're correct, but the difficulty is finding an alternative. Training employees who are then free to take their skills elsewhere gives a company that didn't bother with training the ability to lure workers away with higher salaries. There is a bit of a tragedy of the commons in the skilled labor pool.
Q: What happens if we provide education and training to our employees and they leave?
A: Well, what happens if we don't, and they stay?
Growing a company's knowledge and skill base is an investment, not charity. Companies that don't do it reap exactly what they sow -- they're the same companies whose CEOs will otherwise loudly complain about how difficult it is to find skilled employees, especially at a senior level, and decry the terrible state of universities. As if everyone else just stumbles upon people with twenty years of experience in a particular niche on the street.
Yes, some people will leave. The smart thing to do is to convince as many of them to stay and to stay on good terms with those who leave. Keeping a loyal employee base whose knowledge and skills remain largely unchanged after joining the company doesn't provide any kind of meaningful growth.
HR loves to trot out this saying, attributing it to some enlightened CEO or such. In my experience it's about 50-50 whether someone stays because we offer advanced training or leverages their new skills to get a new job.
Well it's also a pool, meaning it rotates, you get some, lose some, get some, lose some, etc. Maybe a dev will take 2-3 jobs to learn the trade (PHP here, Js there, some fundamental web stuff to top it off, and here's your "professional-grade developer". Great.) But you get to hire equivalent devs at each step, then it's just about $/skill.
Now, if all companies made it part of their "offer" to train people "enough" (say, 1d/w), then you'd expect the whole workforce to become more qualified, improving over time and with age.
You could actually recover the "investment" of training the equivalent of university/grad/postgrad for all employees simply because everyone else would be doing it too (and it would likely lower wages a bit during the early years of these newcomers, since they'd skip the idling twenties that many youths currently spend).
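To make that concrete, here's a minimal back-of-envelope sketch in Python. Every number in it is made up, purely to show the break-even logic, not a claim about real wages or training costs:

    # All figures are hypothetical, for illustration only.
    salary = 60_000          # average annual salary (assumed)
    training_share = 1 / 5   # one day per week spent training
    wage_discount = 0.10     # newcomers accept slightly lower pay (assumed)
    discount_years = 5       # ...for their first few years (assumed)

    # What a year of 1d/w training costs the employer:
    training_cost_per_year = salary * training_share      # 12,000

    # What the employer recovers from one newcomer's wage discount:
    recovered = salary * wage_discount * discount_years   # 30,000

    print(f"training cost per year: {training_cost_per_year:,.0f}")
    print(f"recovered per newcomer: {recovered:,.0f}")

Under these made-up numbers a couple of years of training pay for themselves, but only if every firm trains; if only one does, competitors free-ride, which is exactly the tragedy of the commons mentioned upthread.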
I don't know; it's clearly not something you could do overnight or even over a generation, and it's likely to require something deeper and more 'revolutionary' in people's minds. But mathematically and economically it tends to make sense (we did it for years with guilds and journeymen in the Middle Ages, and in some trades practically forever).
I think the current mainstream, mass education (take in hundreds or thousands and graduate them each year) is just the result of industrialization's need for an educated workforce, a novelty of the late 19th and 20th centuries.
I think the needle is moving, and the explosion of alternative means, times, and ages of learning is a strong indicator of that.
One approach that reduces the tragedy of the commons:
Some places require spending a certain percentage of payroll on training by law, failing which the employer must pay the difference to a government training fund. I live in the Canadian province of Quebec which is such a place. I think at least one major tech city in the US has a similar law, though I'm not sure.
Plumbers and electricians don’t go to school for multiple years to learn a trade, they apprentice - learning on the job, until they get to the point where they propel themselves forward.
Well, no (or at least I guess it depends which educational system you are talking about).
In a few educational systems, you have dedicated curricula for technicians and artisans/craftsmen, with theory in classes, practice in labs, and internships on the job.
Even if learning on the job is a big part (roughly half of the training), it's not purely that.
> You ever think about how much we subsidize businesses by paying for college ourselves?
1. Generally, education leads to productivity.
2. College education is typically transferable -- learning CS topics improves worker productivity regardless of who they work for.
3. Education is sticky to the individual; it can't be repo'd (or confiscated by fascists / nazis).
4. Employers generally try to match wages to productivity. Even if they pay as little as possible, in a fair market they will have to be prepared to bid close to worker productivity or lose out to a competitor who will.
Given these factors, I think the status quo will do a better job optimizing things than requiring employers to pay for training. When deciding who should bear the costs of training, it's appropriate to remember who the benefits accrue to. That's not only fair, it also ensures the incentives are aligned. When the topic is general and transferable, the benefits of training largely accrue to the trainee.
Which is why I'm sitting in a training class about how our software works and code review culture, and not one about Python or Go.
The state de-funding of the university is great for corporations in a lot of ways. First, as you note, it's a way of getting you to pay for your own job training.
It's also a clever way for tech companies to externalize a lot of their R&D costs—because academic labs often rely on private grant funding more than the state, corporations can determine research priorities by extending grants, and then have the costs of that research partially subsidized by tuition-paying students!
This is an interesting talk on the subject. Haven't listened to it in a while but it really shaped my thinking about the university while I was a student: https://wearemany.org/a/2014/06/fall-of-faculty
Not to undercut your point too much, but no one comes out of university learning about the latest and greatest tech unless they went to a graduate program where they did research in tech. They may have played around with it more on their own time, but most universities aren't teaching cutting edge stuff.
And yet so many dev interviews focus on the things they grill you on in CS classes for four years, things most devs will then hardly touch for the rest of their careers.
You are correct. But that also frees us to study what we want, which is why some people end up studying liberal arts that have very little market demand.
When corporations start sponsoring degrees you will see a lot less "fluff" in those forms of higher education.
Not really a faster road, as anyone stuck in an express lane behind a Prius will tell you, but a less congested road free of peasants. Not really sure how this analogy syncs up, but just throwing in my $0.02.
If you want net neutrality you should be excited by companies building private networks because it means there's a competitive market that no one ISP or government can control.
Are side projects really that big of a deal? This is tech. Everybody has side projects. Can you imagine if a Herman Miller furniture designer was sued for making chairs on the side?
> Can you imagine if a Herman Miller furniture designer was sued for making chairs on the side?
If the contract they signed with Herman Miller expressly forbade them from doing this, yes, I can totally imagine that. The problem isn't the side project, per se -- it's starting a side project that is in direct or indirect competition with your employer. If you work in the Alexa group at Amazon, for instance, they're probably not going to care about a "side project" of selling sparkly pony dolls on Etsy, but if your "side project" is developing a new voice assistant, you're going to have a problem.
Yes, generally if you work full time, your employer contractually owns any IP you produce unless you disclose that you're involved in outside projects.
They might own what you do on company time with company resources, but full-time employment does not entail your employer owning the intellectual property you produce on your time with your own resources.
But my understanding is your employment contract can pre-assign that property to your employer, and that contract will be enforceable in (e.g.) California if the IP relates "at the time of conception or reduction to practice of the invention to the employer's business, or actual or demonstrably anticipated research or development of the employer".
Totally depends on your employment contract. Plenty of large places, and even smaller ones, include a "we own everything you create" clause.
Now, how enforceable that is often depends, but the Oculus case is just one major example of how a company can assert ownership of IP created outside of work hours.
Right, I was addressing the OP who said that full time employment means that your employer owns the intellectual property you create on your own time, which isn't the case by default.
And for large companies, even if they don't "own everything you create" they often have their fingers in enough pies for anything you could come up with to be construed as being a conflict of interest.
If my side project is something that benefits from (in this case, perhaps even 'consists of') the intellectual property I'm creating for my current employer, and is also something that I intend to turn into a new business ... that seems quite a lot different from "I do some open source contributions" or "I make some furniture for home".
"I make some furniture from home" is a completely different analogy from the one i just made. Either you didn't get it or that's willful contextomy.
Every company started by someone uses experience gained at a previous job. These contracts effectively make starting any company a breach of contract. And just because something is in a contract doesn't mean it can't be thrown out by a judge if the terms are too unreasonable. These terms are too unreasonable.
Also, the concept of "intellectual property" is misunderstood and abused by the legal system. Originally it was meant to prevent people from writing books that ride on the success of another person's work, like trying to get paid for Harry Potter fan fiction. It doesn't mean that after being a fiction writer for one publisher, the publisher subsequently owns all fictional writing you do for the rest of your life. Prince shouldn't have had to change his name to the Artist just so he could write music again. Maybe Nuvia's CEO needs to change his name too, just so he can continue making microchips.
> Every company started by someone uses experience they've generated at a previous job.
No-one (at least I don't think anyone) is suggesting that knowledge, experience and skills belong to your employer.
But if I'm a video game developer, and I invent a new shading technique for video game graphics while I'm employed at BigGameCo (whether at home or at work), and I have signed a contract that assigns ownership of my inventions to BigGameCo, then that contract is generally enforceable (again, according to my non-lawyer understanding) and that invention belongs to BigGameCo.
I'm not trying to say the line is always going to be clear but skills/experience/knowledge is fine; work-product is not. Bring your sales know-how; not your Rolodex. Bring your software architecture chops; not design documentation; etc.
>No-one (at least I don't think anyone) is suggesting that knowledge, experience and skills belong to your employer.
That's the whole argument. There's literally no other argument. There is no separation between "previous body of work" and "experience, skills, or knowledge." I think you missed the "Prince had to change his name to the Artist Formerly Known as Prince just so he could continue making music" analogy. That's exactly what's going on here.
>But if I'm a video game developer, and I invent a new shading technique for video game graphics while I'm employed at BigGameCo (whether at home or at work)...
If you're a pioneer in shaders, it's because you've spent years of time and effort trying to understand the problems associated with this one specialized field. You are going to continue being a pioneer in shaders long after you leave your current employer, because that's where you are the most competitive, because that's where all of your knowledge, experience and skills are. To change fields now would be career suicide. You would no longer be a specialist. Your years of knowledge and experience in shaders would lose all value if you decided to dig ditches/whatever alt line of work you go into, and that's what these contracts are forcing you to do: brave a job market where you have no advantage for your time spent at your previous company.
Under these types of contracts, you're not allowed to continue on your career trajectory after leaving a company. You would be building off the previous work you did with them, and you're approaching problems with the same solutions you already came up with. That means your old employer owns the rights to all of your subsequent work. The same thing happens to musicians. Since every song they write is an iteration of their previous body of work, musicians who try to leave their record label can be sued for the rights to every subsequent song they write. Prince had this happen and changed his name to "The Artist Formerly Known as Prince" to skirt around the contract. A ridiculous solution to a ridiculous problem. At the time I just thought Prince was being crazy. Apparently not.
I'm also not talking about what types of contracts have been enforced in the past. Obviously these companies keep trying to use these contracts because there is precedent, but there's precedent for fucking everything in this country, and I could write a historiography of court-ordered fuckery if need be. What I'm trying to say is that these types of contracts have been thrown out in the past for being unreasonable, and they should all be thrown out in the future. This was not the intended spirit of any law allowing people to own "intellectual property."
There's a lot of grey area between "startup" and "side project". I do know quite a few people who operate niche cloud services that pull in a few thousand dollars a month. Certainly not enough to build a company around, but the $ per hour works out pretty well. I could see how someone might want to quit their job to work on turning their side project into a startup.
That said, nobody I know over 30 who works in tech does their "side projects" in tech anymore. We've all moved on to kids or hobbies that allow us to escape the tech world like music or painting.
I think side projects over 30 will be more common pretty soon. Income inequality is getting worse and worse, other industry incomes are falling like crazy and lots of people are career changing to tech.
I've had contractor coworkers get those sorts of provisions in their employment contracts struck, but they had to hire lawyers to examine them, and this is in California.
I’m lucky my employer has a really fair moonlighting policy (basically, I can’t use company equipment for my outside projects and I can’t get paid to talk about stuff I explicitly learned at work), but many large tech companies assign ownership to any code you write, whether it’s a side project or not.
How many algorithms are there in Chrome alone? I remember when people realized that they could game Facebook shares for higher search rankings, and for a while BuzzFeed top-ten lists outranked Wikipedia every fucking time. I guess that's still going on. What a clusterfuck search results are nowadays.
If anyone builds anything, please make it so algorithms or queries are archived. I hate how I can't find anything on the internet that I searched for and found years ago. It's like the history of the internet evaporates every year. I don't even know if some websites still exist or if I simply can't find them because rankings are terrible.
I’m to the point that I haven’t been on a new website in years. How do you find new websites in this day and age when the same websites are ranked at the top every time?
OP may be confusing Edge/Brave/Opera with Firefox here, so I can't really blame them for having had the impression that Firefox and Chrome share the same engine.
And in at least one regard it's entirely true: on iOS, everything uses WebKit.
Nope. It was a fork of WebKit which was a fork of KHTML, the engine powering Konqueror. KHTML was developed by the KDE team.
Edit: It was widely speculated that Apple would use Gecko for Safari, and it was a shock when they announced they would use the relatively little-known KHTML engine. The decision was based on KHTML having much cleaner code. I haven't looked at the Mozilla code in many years, but it was pretty gnarly back then, with lots of old cruft from the Netscape days. In comparison, KHTML was beautiful.
Ken Kocienda's book "Creative Selection: Inside Apple's Design Process During the Golden Age of Steve Jobs" has more details about the Safari team's evaluation of Mozilla code and KHTML. Ken was engineer #2 on the Safari team.
I went looking into this, and it appears that I was confused. Google was a big contributor to Firefox in the early noughties, which is the period I was thinking of. Chrome didn't come out until a good while later.
FF just got a WebSocket inspector, so I really don't know what Chrome's dev tools have now that FF's don't. Everything about FF dev tools is better, especially for looking at CSS layouts.
As a web dev, I find the FF dev tools a good bit less performant, but I still try to always use them without reverting to Chrome. The one feature I have not found in FF dev tools is Search, which is very handy for finding random JavaScript on a JS-heavy site.
This has finally allowed me to switch. I've been trying to since Quantum came out two years ago, but this was always a deal breaker.
Firefox is generally still a little buggier and less performant in my experience, but not so much that I want to switch back. Hopefully they can stay relevant.
The dev tools available in Firefox Developer Edition are honestly way better than Chrome's in every possible way, as far as I'm concerned. I recently switched myself and couldn't be happier!
Yes, but a blunt counterstatement doesn't make for good reading. Far better to provide some substantiation so everyone can learn, as your more highly voted sibling comments did!
Targeting ads is only one teeny tiny use case for user data. Building general AI is a race, and the more user data you have, the faster your AI will learn. You definitely want to limit the amount of data your competitors have.
Or... if most knowledge work happens on a computer, and you just want the AI to replace knowledge workers, then maybe how people use computers is the only thing the AI needs to know.
Probably true for AGI, but the overloaded term "AI" could also be applied to learning from how people use software in order to _carefully_ suggest improvements.
Usage stats could help you improve a user interface, teach users new skills (e.g., "you always do this; here's a shortcut to save you time"), and perhaps drive more personalized operating system interfaces and defaults.
You probably don't need a fancy pseudo-"AI" system for any of these, but that's the current gold rush, so someone's gonna do it.
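For what it's worth, the shortcut suggestion doesn't even need ML. Here's a minimal sketch in plain Python (all names and the event log are hypothetical) that just counts repeated action sequences and nags once one crosses a frequency threshold:

    from collections import Counter

    # Hypothetical log of UI actions, in the order the user performed them.
    events = [
        "open_file", "format_doc", "save",
        "open_file", "format_doc", "save",
        "open_file", "format_doc", "save",
        "open_settings", "close_settings",
    ]

    def frequent_sequences(events, length=3, threshold=3):
        """Count every consecutive run of `length` actions and return
        the runs seen at least `threshold` times."""
        counts = Counter(
            tuple(events[i:i + length])
            for i in range(len(events) - length + 1)
        )
        return {seq: n for seq, n in counts.items() if n >= threshold}

    for seq, n in frequent_sequences(events).items():
        print(f"You've done {' -> '.join(seq)} {n} times; "
              f"here's a shortcut to save you time.")

Anything fancier (personalized defaults, adaptive interfaces) is basically this plus a model, which is where the "AI" label sneaks in.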
That's pretty absurd, considering the average American has a screen in front of their face 8-10 hours a day. You're basically saying that what people do for 8-10 hours a day, every day, for their entire lives isn't useful information to an AI.