Hacker News | viddi's comments

Not quite. The very first German bash.org clone is https://bash.pilgerer.org/


It should be noted that the malware bundling happened while SourceForge was owned by DHI Group, Inc. SourceForge changed owners (BIZX/Slashdot) many years ago now. They have undone the bundling and are trying to run the site like it was run before. It seems to be going well.

I would consider SF a viable GitHub alternative, but the bad reputation caused by a temporary owner just seems to stick forever.


The SourceForge UI and overall experience is still a decade or more behind the experience on GitHub and most of the other modern, maintained equivalent sites/services like Fossil, GitLab, Gitea, etc.

That ancient-feeling UI doesn't win them a lot of forgiveness. If the only notable change has been "we took away the malware" and the site otherwise remains stagnant, SourceForge will continue to feel very inferior to more modern alternatives.


Just goes to show how much damage you can do to a brand by betraying your users. I still won't use SourceForge, fuck them.


SF's development also essentially stalled during that period, and the current owner has to do much more than (proper) management to cope with competitors. If you need proof, compare SF with Gitea/Forgejo.


There is a service that produces descriptions of each scene:

https://www.videogist.co/

But I don't know about searchability.


Given that E2EE messengers usually require a smartphone as the primary device, my guess is that they are trying to push the last remaining web-only (no-app) users to their messenger app.

I'm one of them and I don't like this.


The end-to-end encryption also works on the web. I’ve used it and it’s excellent. You need to use a PIN to access your past messages from their backup HSMs, but other than that it’s completely transparent.


If I understand the parent comment right, this was an argument against ProtonMail's End-to-End Encrypted Webmail 5+ years ago.

The argument being that some assurances typically associated with E2EE (that "even we can't see what you're doing") are shakier without a disinterested third party serving the application to the user. If you have some target user `Mr. X`, and you operate the distribution of your app `Y`, you could theoretically serve them a malicious app that sidesteps E2EE. And since it's just a web app, the blast radius is much smaller than if you had to go through the whole update process with Google or Apple and have the malicious version distributed to all users.


Do you know if E2EE also works on the web without having to install the app? That would be novel.


Yes. It does.


??? FB Messenger is available on facebook.com ?


Yes, and my guess is that they are planning to remove the standalone messenger from the web version. You'll probably need to have the FB Messenger app installed on a smartphone in order to use E2EE. That would make it impossible to write messages on the web version (i.e. facebook.com) without having the app installed. I currently do not have the app installed and am able to write messages on the pure web version of FB on desktop. My guess is that they are enabling E2EE to get the last remaining desktop-only and website-only messenger users to install the app. Hope that clears it up.


According to the article, they went through a lot of trouble to make it work in web browsers. It would be odd to drop it after doing that.


Again, my point is not that FB Messenger will stop working in the web browser altogether. My point is that FB Messenger will stop working in the web browser if you don't have the FB Messenger app installed on your smartphone as the primary device.


The OA mentions bringing E2EE to web clients.


In a way that works well on low power mobile devices?

Most people I know using FB Messenger do so on desktop via facebook.com and via the app on mobile. I don't see them removing the former any time soon, but if the web-only version still exists for mobile users, perhaps that will go.


You can't use the web version on mobile, it tells you to install the app.


Or if you have to use desktop mode in your browser...


WhatsApp (also by Meta!) supports E2E encryption on the web app.


I can only seem to get the report for Germany. Here is the bot's reply (formatting omitted):

>No transparency report is available for your region. If any IP addresses or phone numbers are shared in accordance with 8.3 of the Privacy Policy[0], we will publish a transparency report within 6 months of it happening and will continue publishing semiannual reports.

>Note: for a court decision to be relevant, it must come from a country with a high enough democracy index[1] to be considered a democracy. Only the IP address and the phone number may be shared.

[0] https://telegram.org/privacy#8-3-law-enforcement-authorities

[1] https://en.wikipedia.org/wiki/Democracy_Index


I have been using Tumbleweed for a few years now, but while it seems like a stable rolling-release distribution, I am not quite sure about the "rolling release" part. Each month, a new snapshot comes out, which upgrades every single package you have installed, regardless of whether there were actual upstream updates. With a full TeX Live installation and just a few more suites, this amounts to roughly 10,000 packages and over 5 GB that need to be downloaded and installed each month. This a) kind of defeats the rolling-release aspect for me, b) takes a few hours, and c) feels like a cheat for the sake of stability.

Between those snapshots you might have bleeding edge updates for all the packages, but even then I do encounter package conflicts way too often. Well, on the upside, at least they are detected.

So yes, it is stable, but it comes at a price.

Apart from that, the community support felt mediocre, at least a few years ago. The most visited platform was a bulletin-board forum with very little interaction. When I had trouble installing KDE, it took a few days until someone suggested the correct diagnostic tools. That is bad for the testbed of a commercial distribution. In the end, I just installed Arch, which packaged KDE better than Tumbleweed did.

But on the other hand, maybe you only get to see a distro's downsides once you've used it long enough, and each one has them.


There are new snapshots at least once a week. While there are large updates every once in a while, those are usually due to gcc or glibc upgrades which require a rebuild of most packages -- which doesn't happen every month. If you actually have upgrades of every single package every month, you should open a bug report to figure out what is going on -- that is absolutely not normal. On my machine I usually see 10-30 packages per update, with some updates hitting ~100 packages -- anything more than that is quite rare. Large rebuilds should be uncommon, though some packages might do them more than others.

There are quite a few things I've grown to dislike about Tumbleweed after using it for the past 7-8 years, but the upgrade experience is not one of them.


Great, thanks for telling. I'll stick to Leap. Update fatigue. We're using it on servers, and I'm looking for something to replace Ubuntu on the desktop. I've grown tired of the usual Ubuntu antics like snaps, ads in apt update, and the convoluted /etc config hierarchy inherited from Debian. SuSE is structured way more logically and makes more sense, even if it's an RPM distribution. I don't want a rolling distro at work that breaks things just when I have to deliver stuff or fix time-pressing issues. I tried Debian 12 when it came out, but Firefox was unusable: some very annoying focus issues on forms.

I'd like to know if the GNOME 4 in Leap is usable. If I do a search on software.opensuse.org for gnome-desktop, it only turns up packages from Tumbleweed and experimental packages from SLE-15-SP2, which looks quite ancient.


You can always run Debian testing on the desktop. I've been running the same installation for more than a decade now. It's not "bleeding edge", but recent enough. Things slow down a bit during freezes, but it's a rolling distro at the end of the day.


>Each month, a new snapshot comes out, which upgrades every single package you have installed, regardless of whether there were actual upstream updates.

A) Snapshots come out far more regularly than once a month. Going from the mailing list, we've actually had a snapshot released every day since the 4th. I'd say the average is one every three days, but that's just off the top of my head.

B) They do not cover every package you have installed: they can't even know what packages you have installed, since snapshots are cut on the repo side. It's true that packages get rebuilt a lot, but that's because either the package itself was updated or one of its dependencies was, which triggered a rebuild.


>So yes, it is stable, but it comes at a price.

That's called a trade-off, a very fundamental concept in pretty much everything in life.

And no, snapshots don't come out every month. They come out ~5 times every week. The most packages I've ever had to update at once was maybe around ~3,500, and that was after 6 months of not upgrading my system.

I agree that for "true" rolling-distro enthusiasts, Arch is still the top choice, but Tumbleweed is great for those seeking to use a rolling distro without sinking too much time into configuration.


If you're talking about Jitsi Meet, and to add another perspective: I have tried to install it and failed. Specifically, you could see that other participants had joined, but you could not see their video.

I have tried to set it up in different ways multiple times, following all the steps of the fantastic documentation meticulously, with full server resets in-between, researched the error messages thoroughly and even asked in the Jitsi forums, but there was just no way.

But if it works, it's a great piece of software that's easy to set up with really great documentation. I'm blaming the fact that it didn't work out on the hosting company (Strato) and their wonky VPS.


Flash games and animations often had a lot of static assets, like sound, bitmap images, or fonts. You could data-URI all of that into your HTML file, too, but with all the base64 overhead it might be a good idea to compress everything afterwards (e.g. SingleFileZ, see my other comment). Of course, only if file shipment is an issue for you; otherwise relying on your server's gzip compression might be enough.
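As a rough illustration of the data-URI idea, here's a minimal Python sketch (the asset bytes are a made-up placeholder; a real build step would read files from disk). Note the roughly 33% base64 size overhead, which is what makes post-compression attractive:

```python
import base64

def data_uri(data: bytes, mime: str) -> str:
    """Encode raw bytes as a base64 data URI (roughly 33% larger than the input)."""
    return f"data:{mime};base64," + base64.b64encode(data).decode("ascii")

# Inline a stand-in image asset directly into the markup:
asset = b"GIF89a"  # placeholder bytes, just the GIF magic, for illustration
img_tag = f'<img src="{data_uri(asset, "image/gif")}">'
print(img_tag)  # <img src="data:image/gif;base64,R0lGODlh">
```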


Try xz-compressing your assets into a blob and embedding it in wasm (C or Rust for best performance). You can inline the wasm in the single HTML file along with the JS and CSS. At load time, in wasm, you decompress the blob and instantiate the elements/objects/records by parsing it. You can keep them in wasm or copy/move them to JS land. Sounds like a lot, but it's pretty easy.
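A minimal sketch of the build-time half of that idea in Python (the asset blob is made up; the decompression half would live in the wasm/JS loader, which is not shown):

```python
import base64
import lzma

# Stand-in for concatenated static assets (sounds, bitmaps, fonts, ...).
blob = b"static asset bytes " * 5000  # ~95 KB of highly compressible data

compressed = lzma.compress(blob, preset=9)        # xz-format container
inlined = base64.b64encode(compressed).decode()   # string you can paste into the HTML

print(f"{len(blob)} -> {len(compressed)} -> {len(inlined)} bytes")

# Sanity check: the loader's job is just the inverse of the two steps above.
assert lzma.decompress(base64.b64decode(inlined)) == blob
```

Because xz runs after base64 inlining (or, as here, before it on the raw blob), the repetitive asset data compresses far below its original size, recovering the base64 overhead and then some.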


Although there is no agreed-upon standard for encapsulated single-file web content, you can try to run your finished product through SingleFileZ or single-filez-cli and ship that around, or embed it in an iframe. I'm just not sure if it's suitable for games or applications that load static assets upon certain interactions.


Maybe I'm going out on a limb here, but wasn't the whole point of CSS to not have to touch the markup when changing the styling? Not trying to talk Tailwind down; I have just missed a lot of the evolution of web design in the past 15 or so years.


I've never understood the point of separating markup from the design. I've seen CSS Zen Garden and... neat? But I don't get the point in real-world use. I've never had to redesign a UI in a way where I'd only change CSS and not also be moving around markup in the process. And once you're changing both of those things, there's the overhead of cross-referencing and keeping names and hierarchy in sync. At that point I don't know why I'd want the 2 separate.

The most magical moment for me when I started using Tailwind was realizing you can simply cut-and-paste an arbitrary block of HTML from one place to another and have it just render exactly as you'd expect. You don't have to additionally copy-paste a block of CSS, and then fudge around with the selectors because in the previous place it expected to be in some parent container or what have you.


The question of where to put the styles boils down to "do your content and style people work in completely separate teams?" For a newspaper, this is true. For someone who got their mate to set up their blog, this is true. Early days web was mostly like that. Not any more. If it's the same person doing it, put it in the same place. Separation will just be a hindrance.


It still is mostly like that: the content is not the HTML. The HTML is part of the presentation layer, as much as the CSS is.

The content will be inserted from an API or with a templating system.


I want to see the company where the HTML and CSS are handled by separate teams.


That reminds me of the pitch from the first time I discovered CSS Zen Garden: "you can redesign your website without changing the HTML!"

I sort of liked it at first. I even gave my users the opportunity to modify the CSS to add new styles and share these with other members of the community via a configuration page. It was a neat thing, but not many people bothered, and in the end it made everything more complicated (and I guess I was really scared of vulnerabilities the whole time; what can go wrong with letting your users set the stylesheet?).

Worse, I think this new approach broke the web as it was: an explosion of Flash-layout or Photoshop-or-Fireworks-cut-layout websites. It used to be beautiful. Now the web is just a series of websites styled by CSS. It's sad.

Phones (and their screen sizes, plus Apple killing Flash) really killed the beauty of the web IMO. Now everything's an app.


Well, you have that CSS Zen Garden thing nowadays: it's all the different media, screen sizes, and user preferences. Instead of writing multiple classes that define styles for different outputs, you just have one and let CSS handle the differences. No need to duplicate classes, just semantic names.


It really only would make sense if we were serving pure XML and the webpage was the API, for literally every webpage.

Since that is not the case, Tailwind is great. Even the biggest complaint about Tailwind, bloated HTML size, seems like something you can just compile away.


> It really only would make sense if we were serving pure XML and the webpage was the API, for literally every webpage.

This is, of course, XSLT, which was supposed to take over the web at one point.

https://en.wikipedia.org/wiki/XSLT


Yes, but the way we do HTML has also changed, for example by using JSX and reusable components. The issue of having to change markup in a lot of places to change styling doesn't exist anymore if you abstract that away into reusable components.


And using Tailwind without a reusable-components system is cumbersome, to say the least. What the documentation suggests is to use editor features (multi-cursor, search and replace, ...): https://tailwindcss.com/docs/reusing-styles#using-editor-and...


Yes. Tailwind inverts that axiom to great effect. It's leaky on purpose.


Everything on the web became a lot more dynamic. Stuff like web apps, obviously, but even relatively straightforward info pages often end up with dynamic filtering and elements of app-like functionality.

Given that, the separation becomes more of a pointless burden than actually doing anything useful.


That was back when XML was supposed to take over the world. We were going to style the same semantic content for print and web with CSS and/or XSLT. XHTML was the next big thing. Every web page was actually just a page.

We're not living in that future.

But the CSS ecosystem is still useful, and still adapting to our needs today. Tailwind might be considered one of those adaptations.


The sad thing is, the one good thing to come out of that swamp (XPath) isn't used nearly widely enough.

