262
submitted 1 week ago* (last edited 1 week ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello world,

as many of you probably already know, Lemmy is an open source project and its development is funded by donations.

Unfortunately, as is often the case, donation amounts tend to decline over time when people are not aware of how necessary they are. When older users leave the platform they may stop donating, while new users joining are typically not aware of this and won't start donating to even things out, let alone contribute to an overall increase in donations.

All of the services provided by our non-profit Fedihosting Foundation depend on the development of FOSS platforms, which we can host without paying any licensing or other fees; we only need to cover the infrastructure costs. We are currently investing a small part (€50 each) of the donations we receive in the development of Lemmy and Mastodon, but the majority of the donations we receive are used to cover infrastructure costs. We're currently just about breaking even with the donations we receive, but that's certainly not enough to cover a large part of Lemmy or other software development costs.

We're looking to support sustainable software development for all the services we provide and will post similar announcements on our other platforms to promote donations towards the respective development teams in the coming days.

You can find the original announcement by @nutomic@lemmy.ml below:

cross-posted from: https://lemmy.ml/post/29579005

An open source project the size of Lemmy needs constant work to manage the project, implement new features and fix bugs. Dessalines and I work full-time on these tasks and more. As there is no advertising or tracking, all of our work is funded through donations. Unfortunately the amount of donations has decreased to only 2000€ per month. This leaves only 1000€ per developer, which is not enough to pay my bills. With the current level of donations I will be forced to find another job, and drastically reduce my contributions to Lemmy. To avoid this outcome and keep Lemmy growing, I ask you to please make a recurring donation:

Liberapay | Ko-fi | Patreon | OpenCollective | Crypto

If you want more information before donating, consider the comparison with Reddit. It began as a startup funded by rich investors. The site is managed by corporate executives who over time have become more and more disconnected from normal users. Their main goal is to make investors happy and to make a profit. This leads to user-hostile decisions like firing the employee responsible for AMAs, blocking third-party apps and more. As Reddit is a single website under a single authority, all users need to follow the same rules, including ridiculous ones like censoring the name "Luigi".

Lemmy represents a new type of social media which is the complete opposite of Reddit. It is split across many different websites, each with its own rules, and managed by normal people who actually care about the users. There is no company and no profit motive. Much of the work is carried out by volunteer admins, mods and posters, who contribute out of enthusiasm and not for money. For users this is great, as there is no advertising or tracking, and no chance of a takeover by a billionaire. Additionally, there are no built-in political or ideological restrictions. You can use the software for any purpose you like, add your own restrictions or scrutinize its inner workings. Lemmy truly belongs to everyone.

Dessalines and I work full-time on Lemmy to keep up with all the feature requests, bug reports and development work. Even so there is barely enough time in the day, and no time for a second job. Previously I sometimes had to rely on my personal savings to keep developing Lemmy for you, but that can't go on forever. We partly rely on NLnet for funding, but they only pay for development of new features, and not for mandatory maintenance work. The only available option is user donations. To keep development viable, donations need to reach a minimum of 5000€ per month, resulting in a modest salary of 2500€ per developer. If that goal is reached, Dessalines and I can stop worrying about money and fully focus on improving the software for the benefit of all users and instances. Please use the link below to see current donation stats and make your contribution! We especially rely on recurring donations to secure long-term development and make Lemmy the best it can be.

Donate


edit, as this was frequently brought up:

Will donations to Lemmy development go towards the operation of lemmy.ml?

It depends on the donation method used and is limited to around 2% of the minimum overall donation goal. The vast majority of donations are used exclusively for developer salaries.

lemmy.ml hosting is only financed by donations via Opencollective. All other donations go exclusively to developer salaries.

[source]

For donations via Open Collective, yes, a small fraction of donations towards Lemmy development will go towards the operation of lemmy.ml. The reasons for this are that lemmy.ml is used for testing new releases and that it's not worth maintaining a separate donation account for the instance. That said, the money going towards lemmy.ml hosting is just a tiny fraction of the funds being asked for: hosting lemmy.ml costs around €100/month, which is only 2% of the stated minimum donation goal of 5000€/month.

2
submitted 1 week ago* (last edited 1 week ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello world,

we will be performing an update to Lemmy 0.19.11 in an hour.

We are planning for around 15 minutes of downtime today at 21:30-21:45 UTC.

You can convert this to your local time here: https://inmytime.zone/?iso=2025-05-03T21%3A30%3A00.000Z

As mentioned in our April update, we had already backported most of the Lemmy-UI changes for 0.19.11, but we were still missing most of the backend changes.

This update will bring us the remaining changes listed in the release notes, as well as some additional changes not yet released:

  • user registrations are now processed in a DB transaction, which prevents some rare errors we've seen in the past where a registration resulted in an inconsistent user creation state (#5480; see the sketch below)
  • new posts in NSFW communities are now always marked NSFW (#5310)
  • another round of peertube federation fixes (#5652)
  • fixed email notifications for denied applications (#5641)
    This was already supposed to be part of 0.19.11, but it did not work there. We are migrating our previous external email notification implementation to let Lemmy handle sending emails now.
  • various fixes for opentelemetry that are not present in upstream Lemmy
    As opentelemetry support has been removed in Lemmy 1.0, this is not code that we are currently planning to upstream to the 0.19 branch, but we intend to keep using it going forward.

If you are an instance admin considering enabling opentelemetry in Lemmy 0.19, don't do that unless you also apply a similar set of patches to bring the related libraries to newer versions, as your instance will otherwise lock up after some requests.
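To illustrate the registration change mentioned above: the sketch below shows the general shape of running a registration as a single database transaction, so that either every row is created or none are. The table and column names here are hypothetical stand-ins rather than Lemmy's actual schema, and this is not the actual code from #5480.

```sql
BEGIN;

-- Create the account's rows in one transaction: if the second insert
-- fails, the first is rolled back instead of leaving a half-created
-- account behind.
WITH new_person AS (
    INSERT INTO person (name, public_key)      -- hypothetical columns
    VALUES ('new_user', '<key>')
    RETURNING id
)
INSERT INTO local_user (person_id, email)      -- hypothetical columns
SELECT id, 'user@example.com'
FROM new_person;

COMMIT;
```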


Update 21:50 UTC: The update has been completed successfully and within the planned amount of time.

53

Hello,

as this is a fairly active community, we just wanted to let you know that this community is no longer federating with Lemmy.World, as we have defederated from lemmy.one due to a lack of moderation.

Our announcement can be found here: https://lemmy.world/post/28173093

We recommend migrating to a community on a better-maintained instance.

181
submitted 1 month ago* (last edited 1 month ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello world,

we've had various smaller updates lately that didn't each warrant their own posts, and we don't want to publish too many announcements around the same time, so we've collected them into a single larger post.

New alternative Lemmy frontend: Tesseract

We have recently added a new alternative Lemmy frontend to our collection: Tesseract.

Tesseract was forked from Photon a while back and includes a variety of additional customization options and moderation utilities.

It is available at https://t.lemmy.world/.

Lemmy-UI update to 0.19.11-ish

We have deployed a custom build of Lemmy-UI, the default Lemmy web interface, which includes most of the features included in the official 0.19.11 release.

We haven't updated our backend to a newer version yet, as we still have to find a solution for the newly integrated functionality that sends emails on rejected registration applications, but all frontend features that don't require a backend update have been included. The only part currently missing is Lemmy's new donation dialog, as that requires a backend upgrade as well.

You can find the list of changes in the frontend section in the announcement for the 0.19.11 release.

Defederation from lemmy.one and r.nf

A while back, a Lemmy.World user informed us that an instance we are federated with was hosting illegal content. This was the result of an attack more than a year ago, and said content federated to many other instances, which made local copies of the material. Unfortunately, when this material was taken down at the source, that action did not federate to all linked instances, meaning that some instances are still showing this material.

Once we were made aware of this, we realized that this was likely not the only occurrence, so we started looking for other instances where this content may also still exist. We have identified more than 50 affected instances and already reached out to many of them to inform them about this content to have it taken down.

Among these instances, r.nf and lemmy.one were some of the first to be informed, but even 2 months after the initial report there has been zero reaction from either instance. Neither instance appears to be moderated, as is also evident from posts on lemmy.one asking whether the instance is still maintained and from 2-month-old spam in r.nf's main community.

The community that gets hit the hardest by this is !privacyguides@lemmy.one, which is the only larger community across these instances. We recommend looking for alternative communities on other instances.

Due to the lack of action and response we have since also reported this directly to their hosting providers through Cloudflare, which includes an automatic report to NCMEC.

Even if this material gets taken down now, we don't currently believe that the instance operators are willing or able to moderate these instances properly, so we will keep them defederated unless they can convince us that they are going to moderate their instances more actively and provide usable abuse contacts that don't require going through their hosting provider.

We also defederate from other instances from time to time due to lack of moderation and unreachable admins, among other reasons. If you're interested in the reasons for our defederations, we aim to always document them on Fediseer. Be warned though, as this list contains mentions of and references to various disturbing or illegal material.

Most of those instances are either very small, don't interact with Lemmy much anyway, or are explicitly stating support of content that is incompatible with our policies.

We also usually try to reach out to affected instances prior to defederation if we believe that they may not intentionally be supporting the problematic content.

We have temporarily re-federated with lemmy.one to allow this post and https://lemmy.world/post/28173100 to federate to them. Before we defederate again, we're waiting for federation to catch up with the activities generated since we originally defederated a day ago.

Reliability of media uploads

We have recently been receiving some reports of media uploads not working from time to time. We have already addressed one of the underlying issues and are working on addressing another one currently. Please continue to let us know about issues like that to ensure that they're on our radar.

We're currently also working on improving our overall application monitoring to collect more useful information that helps us track down specific issues, improves visibility for errors, and hopefully allows us to identify potential performance issues.

Parallel federation

Back in Lemmy 0.19.6, Lemmy introduced the option to send federated activities in parallel. Without this, Lemmy only ever has one activity in the process of being transmitted to another instance at a time. While most instances don't have a large number of activities going out, we're at the point where instances far away from us can no longer keep up with our traffic, simply because of the latency involved in waiting for data from the other side of the world.

While this was not implemented in Lemmy and deployed on our end, some instances mitigated the problem by setting up an external federation queue near our instance that batched activities together. Unfortunately, this also meant having to maintain an additional server, which means a time investment, a few bucks to pay every month, and another potential component that could break.

We enabled 2 parallel sends around a week ago, and aussie.zone, who had been lagging behind by multiple days almost constantly, have finally caught up with us again. We will continue to monitor this and, if needed, increase the number of parallel sends further in the future, but so far it looks like we should be fine with 2 for a good while.
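For other instance admins who want to try this: parallel sends are controlled through the Lemmy configuration. Below is a minimal sketch of the relevant excerpt from lemmy.hjson, assuming the setting is named concurrent_sends_per_instance as introduced with 0.19.6; please verify the exact key against the config documentation for your Lemmy version before copying it.

```hjson
{
  federation: {
    # Number of activities sent to a single remote instance in
    # parallel. 1 keeps the old serial behavior; we run with 2.
    concurrent_sends_per_instance: 2
  }
}
```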


edit: added section about parallel federation

71
submitted 1 month ago* (last edited 1 month ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello world,

as many of you may already be aware, there is an ongoing spam attack by a person claiming to be Nicole.

It is very likely that these images are part of a larger-scale harassment campaign against the person depicted in them.

Although the spammer claims to be the person in the picture, we strongly believe that this is not the case and that they're only trying to frame them.

Starting immediately, we will remove any images depicting "Nicole" and information that may lead to identifying the real person depicted in those images to prevent any possible harassment.
This includes older posts and comments once identified.

We also expect moderators to take action if such content is reported.

While we do not intend to punish people who post this once without being aware of the context, we may take additional action if they continue to post this content, as we consider that to be supporting the harassment campaign.

Discussion that does not include the images themselves or references that may lead to identifying the real person behind the image will continue to be allowed.

If you receive spam PMs, please continue reporting them, and we'll continue working on our spam detection to try to identify them early, before they reach many users.

2
submitted 1 month ago* (last edited 1 month ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello,

we will be updating pict-rs to the latest version in about 2 hours.

We expect a short downtime of 1-2 minutes during the planned migration window, as there are no major database changes involved.

Most users won't be affected by this, as the majority of our media is cached and served by Cloudflare. This should primarily only affect thumbnail generation and uploads of new media while the service is down.

You can convert this to your local time here: https://inmytime.zone/?iso=2025-03-28T22%3A00%3A00.000Z


The update has been completed successfully.

4

Hello,

as some of you may have noticed we just had about 25 minutes of downtime due to the update to Lemmy 0.19.10.

Lemmy release notes: https://join-lemmy.org/news/2025-03-19_-_Lemmy_Release_v0.19.10_and_Developer_AMA

This won't fix YouTube thumbnails for us, as YouTube banned all IPs belonging to our hosting provider.

We had intended to apply this update without downtime, as we were looking to apply the database migration that allows marking PMs as removed due to the recent spam waves.

Although this update contains database migrations, we expected to still be able to apply the migration in the background before updating the running software, as the database schema between the versions was backwards compatible. Unfortunately, once we started the migrations, we started seeing the site go down.

In the first minutes we assumed that the migrations contained in this upgrade were somehow unexpectedly blocking more than intended while still processing, but it turned out that nothing was actually happening on the database side. Our database had deadlocked due to what appears to be an orphaned transaction, which didn't die even after we killed all Lemmy containers other than the one running the migrations.

While the orphaned transaction was pending, a pending schema migration was waiting for it to complete or be rolled back, so nothing was moving anymore. As the original transaction also wasn't moving, everything started to die. We're not entirely sure why the original transaction broke down, as it was started only about 30 seconds before the schema migration query, and the two running at the same time shouldn't have broken it.
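For admins debugging a similar situation: Postgres can tell you which backend is blocking which. The query below is a generic diagnostic sketch using the standard pg_stat_activity view and the pg_blocking_pids() function, not the exact query we ran during the incident.

```sql
-- Show sessions waiting on a lock, the session blocking them, and
-- how long the blocker's transaction has been open.
SELECT waiting.pid                AS waiting_pid,
       waiting.query              AS waiting_query,
       blocker.pid                AS blocking_pid,
       blocker.state              AS blocking_state,
       now() - blocker.xact_start AS blocker_xact_age
FROM pg_stat_activity AS waiting
JOIN pg_stat_activity AS blocker
  ON blocker.pid = ANY (pg_blocking_pids(waiting.pid));

-- A more targeted alternative to restarting the whole container is
-- terminating just the orphaned backend (replace 12345 with its pid):
-- SELECT pg_terminate_backend(12345);
```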

Lemmy has a "replaceable" schema, which is applied separately from the regular database schema migrations and runs every time a DB migration occurs. We unfortunately did not consider this replaceable schema in our planning; otherwise we would have realized that it would likely have a larger impact on the overall migration.

After we identified that the database had deadlocked, we resorted to restarting our postgres container and then running the migration again. Once we restarted the database, everything was back online in less than 30 seconds, which includes first running the remaining migrations and then starting up all containers again.

When we tested this process on our test instance prior to deploying it to the Lemmy.World production environment, we did not run into this issue. Everything was working fine with the backend services running on Lemmy 0.19.9 and the database already upgraded to the Lemmy 0.19.10 schema, but the major difference there was the lack of user activity during the migration.

Our takeaway from this is to always plan for downtime for Lemmy updates whenever database migrations are included, as it does not appear to be possible to apply them "safely" even if they seem small enough to be theoretically doable without downtime.

17
submitted 2 months ago* (last edited 2 months ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello,

we will be performing the long-awaited update to Lemmy 0.19.9 tomorrow.

We are planning for around 1 hour of downtime between 16:00-17:00 UTC on 16th of March.

You can convert this to your local time here: https://inmytime.zone/?iso=2025-03-16T16%3A00%3A00.000Z

You can find an overview of the changes in our previous announcement here and in the Lemmy release notes:


Update 16:50 UTC:

The upgrade was successfully completed at around 16:27 UTC, but we're still fighting some performance issues after the upgrade. Our database and the outbound federation container are currently using significantly more CPU than expected; we are still investigating to identify the root cause.

3
submitted 2 months ago* (last edited 2 months ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello,

while preparing for our upcoming Lemmy update, we're also updating to a newer database version, which will provide additional performance benefits and functionality.

We are planning for around 10 minutes of downtime for the database update between 19:00-19:30 UTC on 22nd of February. This is not yet the planned Lemmy update; we will announce that separately when we are ready.

edit: You can convert this to your local time here: https://inmytime.zone/?iso=2025-02-22T19%3A00%3A00.000Z

edit 2: The database upgrade has been completed successfully.

348
submitted 4 months ago* (last edited 4 months ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello World,

as many of you know, several newer Lemmy versions have been released since the one we are currently using.

As this is a rather long post, the TLDR is that we're currently planning for late January/early February to update Lemmy.World to a newer Lemmy release.

We're currently running Lemmy 0.19.3 with a couple patches on top to address some security or functionality issues.

As new Lemmy versions have been released, we've been keeping an eye on other instances' experiences with the newer versions, as well as tracking certain issues on GitHub, which might impact stability or moderation experience.

We updated to Lemmy 0.19.3 back in March this year. At that point, 0.19.3 had been out for a little over a month, and all the major issues that troubled the earlier 0.19 releases had been addressed.

Several months later, in June, Lemmy 0.19.4 was released with several new features. This was a rather big release, as a lot of changes had happened since the previous one. Only 12 days later, 0.19.5 was released, which fixed a few important issues with the 0.19.4 release. Unfortunately, Lemmy 0.19.5 also introduced some changes that were, and in part still are, not fully addressed.

Prior to Lemmy 0.19.4, regular users could see the contents of removed or deleted comments in some situations, primarily when using third-party apps. Ideally, this would have been fixed by restricting access to the contents of removed comments to community moderators in the communities they moderate, as well as to admins on each instance. Deleted comments are overwritten in the database after some delay, but they might still be visible before that happens. This is especially a problem when moderators want to review previously removed comments, either to potentially restore them or to understand the context in a thread with multiple removed comments. The Lemmy modlog does not always record individual entries for bulk-removed items: banning a user while also removing their content, for example, will only log the ban, but not the individual posts or comments that were removed.

We were considering writing a patch to restore this functionality for moderators in their communities, but this is unfortunately a rather complex task, which also explains why this isn't a core Lemmy feature yet.

While admins can currently filter the modlog for actions by a specific moderator, this functionality was lost somewhere in 0.19.4. While this isn't something our admin team uses very frequently, it is still an important feature to have available for the times we need it.

The 0.19.4 release also included a few security changes to ActivityPub handling, which broke the ability to find e.g. Mastodon posts in Lemmy communities by entering the post URL in the search. It also caused issues with changes made to communities by remote moderators.

The 0.19.4 release also broke marking posts as read in Sync for Lemmy. Although this isn't really something we consider a blocker, it's still worth mentioning, as there are still a lot of Sync for Lemmy users out there who won't have noticed this issue yet if they're only active on Lemmy.World. Over the last 2 weeks we've had nearly 5k active Sync for Lemmy users. This is unfortunately something that will break during the upgrade, as the API has changed in upstream Lemmy.

There are also additional issues with viewing comments on posts in local communities that appear to be related to the 0.19.4/0.19.5 release and appear to be a lot more serious. There have been various reports of posts showing zero comments in Sync, while viewing them in a browser or another client shows various comments. It's not entirely clear to us right now what the full impact is and to what extent it can be mitigated by user actions, such as subscribing to communities. If anyone wants to research what is needed to restore compatibility, and potentially even propose a patch for compatibility with both the updated and the previous API version, we'll consider applying it as a custom patch on top of the regular Lemmy release.

If there is no Sync update in time for our upgrade and we don't have a viable workaround available, you may want to check out !lemmyapps@lemmy.world to find potential alternatives.

Several instances also reported performance issues after their upgrades, although these mostly seemed to last only a relatively short time rather than persisting.

Lemmy 0.19.6 ended up getting released in November and introduced quite a few bug fixes and changes again, including filtering the modlog by moderator. Due to a bug breaking some DB queries, 0.19.7 was released just 7 days later to address that.

Among the issues fixed in this release: Mastodon URLs can be resolved in the search again, and remote moderators can update communities again.

0.19.6 also changed the way post thumbnails are generated, which resulted in missing thumbnails on various posts.

A month later, in December, 0.19.8 was released.

One of the issues addressed by 0.19.8 was Lemmy once again returning the content of removed comments to admins. For community moderators this functionality is not yet restored, due to the complexity of having to check mod status in every community present in a comment listing.

At this point it seems that most of the issues have been addressed, although there still appear to be some remaining issues with thumbnails not being reliably created in some cases. We'll keep an eye on any updates on that topic to see if it might be worth waiting a little longer for another fix, or possibly deploying an additional patch even if it isn't part of an official Lemmy release yet at the time.

While we were backporting some security/stability related changes, including a fix for a bug that can break federation in some circumstances when a community is removed, we accidentally reverted this patch while applying another backport, which resulted in our federation with lemmy.ml breaking back in November. This issue was already addressed upstream a while back, so other instances running more recent Lemmy versions were not affected by this.

Among the new features released in the Lemmy versions we have missed out on so far, here are some highlights:

  • Users will be able to see and delete their uploads on their profile. This will include all uploads since we updated to 0.19.3, which is the Lemmy version that started tracking which user uploaded media.
  • Several improvements to federation code, which improve compatibility with WordPress, Discourse and NodeBB.
  • Fixing signed fetch for federation, enabling federation with instances that require linked instances to authenticate themselves when fetching remote resources. The lack of this has caused issues with a small number of Mastodon instances that require it.
  • Site bans will automatically issue community bans, which makes them federate more reliably.
  • Deleted and removed posts and comments will no longer show up in search results.
  • Bot replies and mentions will no longer be included in notification counts when a user has blocked all bots.
  • Saved posts and comments will now be returned in the reverse order of saving them rather than the reverse order of them being created.
  • The image proxying feature has evolved to a more mature state. This feature intends to improve user privacy by reducing requests to third party websites when browsing Lemmy. We do not currently plan on enabling it with the update, but we will evaluate it later on.
  • Local-only communities. We don't currently see a good use for these, as they prevent federation of the communities in question. This cuts off users on all other instances, so we don't recommend using them unless you really want that.
  • Parallel sending of federated activities to other instances. This can be especially useful for instances on the other side of the world, where latency introduces serious bottlenecks when only one activity is sent at a time. A few instances have already been using intermediate software to batch activities together, which is not standard ActivityPub behavior, but it allows them to eliminate most of the delays introduced by latency. This mostly affects instances in Australia and New Zealand, but we've also seen federation delays with instances in the US from time to time. This will likely not be enabled immediately after the upgrade, but we're planning to enable it shortly after.

edit: added information about sync not showing comments on posts in local communities

105
submitted 4 months ago by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello World,

today, @db0@lemmy.dbzer0.com has provided an update to the media upload scanner we're using. This should reduce the amount of false positives blocked from being uploaded. We have deployed the updated version now.

While we do not have stats on false positives from before we implemented scanning at upload time, that change did not affect overall data availability for us: flagged images were deleted either way, they were just still served from our cache in many cases. Moving the scan to the upload process has made it much more effective, as previously images could persist in Cloudflare's cache for extended periods of time, while now they don't get cached in the first place.

Over the last week, we've seen roughly 6.7% of around 3,000 total uploads rejected (about 200 images). We'll be able to compare numbers in a week to confirm that this has indeed improved the false positive rate.

156
submitted 5 months ago by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello World,

following feedback we have received in the last few days, both from users and moderators, we are making some changes to clarify our ToS.

Before we get to the changes, we want to remind everyone that we are not a (US) free speech instance. We are not located in the US, which means different laws apply. As written in our ToS, we're primarily subject to Dutch, Finnish and German laws. Additionally, it is at our discretion to further limit discussion that we don't consider tolerable. There are plenty of other websites out there hosted in the US and promoting free speech on their platform. You should be aware that even free speech in the US does not cover true threats of violence.

Having said that, we have seen a lot of comments removed with reference to our ToS that were not explicitly intended to be covered by it. After discussion with some of our moderators, we have determined that there is both some ambiguity in our ToS and a lack of clarity about what we expect from our moderators.

We want to clarify that, when moderators believe certain parts of our ToS do not appropriately cover a specific situation, they are welcome to bring these issues up with our admin team for review, escalating the issue without taking action themselves when in doubt. We also allow for moderator discretion in a lot of cases, as we generally don't review individual reports or moderator actions unless they're specifically brought to admin attention. This also means that content permitted by the ToS can at the same time violate community rules and therefore result in moderator action. We have added a new section to our ToS to clarify what we expect from moderators.

We are generally aiming to prevent content organizing, glorifying or suggesting harm to people or animals, but we are limiting the scope of our ToS to build the minimum framework within which we all can have discussions, leaving a broader area for moderators to decide what is and isn't allowed in the communities they oversee. We trust the moderators' judgment, and in cases where we see a gross disagreement between moderators' and admins' criteria we can have a conversation and reach an agreement, as in many cases the decision is case-specific and context matters.

We have previously asked moderators to remove content relating to jury nullification when it was suggested in the context of murder or other violent crimes. Following a discussion in our team, we want to clarify that we are no longer asking moderators to remove content relating to jury nullification in the context of violent crimes when the crime in question has already happened. We will still consider suggestions of jury nullification for crimes that have not (yet) happened to be advocacy of violence, which violates our terms of service.

As always, if you stumble across content that appears to violate our site or community rules, please use Lemmy's report functionality. Especially when threads are very active, moderators will not be able to go through every single comment for review. Reporting content and providing accurate reasons for reports will help moderators deal with problematic content in a reasonable amount of time.

[-] lwadmin@lemmy.world 60 points 8 months ago

We will be releasing a separate post involving that incident in the next 24-48 hours, just getting final approval from the team.

[-] lwadmin@lemmy.world 139 points 2 years ago

This is a volunteer platform, and as such no one is paid. Applicants may include their availability info and be considered accordingly.

[-] lwadmin@lemmy.world 43 points 2 years ago* (last edited 2 years ago)

No, even when the option comes for users to block whole instances, we will still defederate from instances whose content we do not want to moderate. But we also always reserve the right to re-federate with any instance if the concerns are resolved.

And as per https://lemmy.world/legal : We are not a free speech zone. This Code of Conduct lays out the expected standards of conduct and behavior. Users may not say or post anything that violates these rules, and all participants are required to follow this code. If you disagree with this code, you are welcome to keep looking for other Lemmy instances. Here’s a list of all public instances.

[-] lwadmin@lemmy.world 115 points 2 years ago

A tolerable level we can handle by moderation. And when even the admins join in it becomes clear there is a big incompatibility and cultural difference.

But you probably meant something else, right?

[-] lwadmin@lemmy.world 91 points 2 years ago

Most of their communities had been blocked for months; that's why you didn't see much of them.

[-] lwadmin@lemmy.world 57 points 2 years ago

We are well aware of what's going on with kbin and the development team. That's why we haven't defederated: we have hope that they will fix things soon.

[-] lwadmin@lemmy.world 34 points 2 years ago

I heard it's like Discord

[-] lwadmin@lemmy.world 209 points 2 years ago

@Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this can not be stopped. Lemmy needs better moderation tools.

[-] lwadmin@lemmy.world 117 points 2 years ago

Thank you for the kindness!

[-] lwadmin@lemmy.world 80 points 2 years ago* (last edited 2 years ago)

You need to hover over the status bar to see if there was any downtime for that day. We can enable it to log an incident every time there is a hiccup, but we are still tuning alerts, as we currently only have it create an incident when we ACK it in PagerDuty. You can always check the dashboard for up-to-the-minute stats, as well as https://lemmy-status.org/endpoints/_lemmy-world We'll add this info to make things clearer <3

EDIT: Added more info to our status page, thanks for the feedback Machefi!

EDIT2: Also the missing data is due to us removing and adding more specific monitors for the different infra services.

[-] lwadmin@lemmy.world 125 points 2 years ago* (last edited 2 years ago)

This was a misunderstanding by one of the team members. It has since been discussed and will not happen again. Lemmy.World and this announcement community are our primary platform.

[-] lwadmin@lemmy.world 77 points 2 years ago

Doesn't matter if they are hosted here or not. The way federation works is that threads on different instances are cached locally.

We have NO issues with the people at db0 - we are just looking out for ourselves in a 'better safe than sorry' fashion while we find out more. As mentioned in the OP we would like to unblock as soon as we know we can not get in any legal trouble.
