
I can't say for sure, but there is a good chance I might have a problem.

The main picture attached to this post is a pair of dual bifurcation cards, each with a pair of Samsung PM963 1T enterprise NVMes.

They are going into my R730XD, which... is getting pretty full. This will fill up the last empty PCIe slots.

But, knock on wood, my R730XD supports bifurcation! LOTS of bifurcation.

As a result, it now has more HDDs and NVMes than I can count.

What's the problem, you ask? Well, that is just one of the many servers I have lying around here, all completely filled with NVMe and SATA SSDs...

Figured I would share. Seeing a bunch of SSDs is always a pretty sight.

And, as of two hours ago, my particular Lemmy instance was migrated to these new NVMes, completely transparently.

[-] Decronym@lemmy.decronym.xyz 47 points 1 year ago* (last edited 1 year ago)

Acronyms, initialisms, abbreviations, contractions, and other phrases which expand to something larger, that I've seen in this thread:

Fewer Letters   More Letters
NVMe            Non-Volatile Memory Express interface for mass storage
PCIe            Peripheral Component Interconnect Express
SATA            Serial AT Attachment interface for mass storage
SSD             Solid State Drive mass storage

4 acronyms in this thread; the most compressed thread commented on today has 3 acronyms.

[Thread #13 for this sub, first seen 8th Aug 2023, 21:55] [FAQ] [Full list] [Contact] [Source code]

[-] 21Cabbage@lemmynsfw.com 4 points 1 year ago

Fantastic bot, honestly.

[-] infinitevalence@discuss.online 15 points 1 year ago

I don't see any issues!

/me hides his 16x 4TB 12G SAS drives...

[-] brygphilomena@lemmy.world 5 points 1 year ago

I think I'm at 7x 18TB drives. I'm slowly replacing all the smaller 8TB disks in my server. Only 5 more to go. After that it's a new server with more bays and/or a JBOD shelf.

[-] iesou@lemm.ee 1 points 1 year ago

That's my next step. I have 8x 8TB drives I need to start swapping, 2x 512GB NVMes for system/app cache, and 1x 2TB NVMe for media cache.

[-] Millie@lemm.ee 7 points 1 year ago

I dream of this kind of storage. I just added a second M.2 with a couple of TB on it, and the space is lovely, but I can already see I'll fill it sooner than I'd like.

[-] xtremeownage@lemmyonline.com 5 points 1 year ago

I will say, it's nice not having to nickel-and-dime my storage.

But, the way I have things configured, redundancy takes up a huge chunk of the overall storage.

I have around 10x 1T NVMe and SATA SSDs in a Ceph cluster. 60% storage overhead there.

Four of those 8T disks are in a ZFS striped mirror / RAID 10. 50% storage overhead.

The 4x 970 EVO / EVO Plus drives are also in a striped-mirror ZFS pool. 50% overhead.

But still PLENTY of usable storage, and highly available at that!
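To put rough numbers on that overhead (illustrative arithmetic only; the Ceph figure depends on the replication or erasure-coding settings in use, and real pools lose a bit more to metadata and near-full reserves):

```python
# Rough usable-capacity arithmetic for the pools described above.

def replicated_usable(raw_tb: float, copies: int) -> float:
    """Ceph-style replication: usable space is raw capacity divided by the copy count."""
    return raw_tb / copies

def striped_mirror_usable(raw_tb: float) -> float:
    """ZFS striped mirror (RAID 10): half the raw capacity goes to the mirror copies."""
    return raw_tb / 2

print(replicated_usable(10, 3))      # ~10x 1T SSDs, 3 copies: ~3.3T usable (~67% overhead)
print(replicated_usable(10, 2))      # same drives, 2 copies: 5T usable (50% overhead)
print(striped_mirror_usable(4 * 8))  # 4x 8T in a striped mirror: 16T usable (50% overhead)
```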

Cripes, I was stoked I managed to upgrade from 4x 2TB to 4x 4TB recently.

[-] maxprime@lemmy.ml 5 points 1 year ago

If that’s a problem then I don’t want to be solved.

[-] xtremeownage@lemmyonline.com 2 points 1 year ago

It's only a problem when you get the electric bill! (Or when the wife finds your eBay receipts.)

[-] I_Miss_Daniel@kbin.social 3 points 1 year ago

I doubt these use much power compared to their spinning-rust antecedents.

[-] xtremeownage@lemmyonline.com 2 points 1 year ago

I meant my general electric bill. My server room averages 500-700 watts.

[-] steeev@midwest.social 2 points 1 year ago

Was curious how many watts this machine pulls. Also curious: if you had ever filled it with spinning disks, would flash be less power hungry?

[-] joel@aussie.zone 4 points 1 year ago

Love this. Apart from hosting an instance, what are you using it for? Self-cloud?

[-] xtremeownage@lemmyonline.com 4 points 1 year ago

I host a few handfuls of websites and some Discord bots.

I hoard Linux ISOs. I use it for general-purpose learning and experimentation.

There is also Kubernetes running, source control, and a bit of everything else.

Is your problem that you are bragging about your drives?

[-] xtremeownage@lemmyonline.com 1 points 1 year ago

I'm out of room to add more drives!

Every one of my servers is basically completely full on disks. I need more servers.

[-] scottmeme@sh.itjust.works 2 points 1 year ago

Do you happen to have a link to those cards?

[-] xtremeownage@lemmyonline.com 4 points 1 year ago* (last edited 1 year ago)

Dual Slot Bifurcation Card. Those are the ones I just picked up.

If you have an x16 slot, can fit a full-height card, and can use 4x4x4x4 bifurcation, the ASUS Hyper M.2 is really good.

[-] scottmeme@sh.itjust.works 2 points 1 year ago

Sweet!

I've got a Gen 3 Hyper M.2, but I was looking for something for the x8 slots in one of my servers without needing full-height cards.

[-] xtremeownage@lemmyonline.com 1 points 1 year ago

That's the exact use case I got these for.

[-] SirNuke@kbin.social 1 points 1 year ago

Do you have any trouble with cooling or anything with them? Got like a billion unused PCIe lanes in my Dell R730 and can think of a few things that might benefit from a big NVMe ZFS pool.

[-] xtremeownage@lemmyonline.com 1 points 1 year ago

Generally, no.

I run a custom fan control script which keeps the fans at around 30% minimum, but ramps them up if needed.

Below 30%, some things were getting toasty.
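The script itself isn't posted here, but for anyone curious, this is a minimal sketch of the usual approach on these Dell boxes: poll the IPMI temperature sensors and push a manual fan duty cycle using the widely shared iDRAC raw codes. The thresholds and ramp below are made-up illustration values, and you should confirm the raw codes behave the same on your generation before running anything like this.

```python
import re
import subprocess
import time

MIN_PCT = 30         # fan floor; below this, things were getting toasty
RAMP_AT_C = 60       # assumed temperature (deg C) at which we start ramping up
POLL_SECONDS = 30    # how often to re-check the sensors

def ipmi(*args: str) -> str:
    # Local IPMI access; add -H/-U/-P arguments here for a remote iDRAC instead.
    return subprocess.run(["ipmitool", *args], capture_output=True,
                          text=True, check=True).stdout

def max_temp_c() -> int:
    # Pull every "NN degrees C" reading out of the temperature SDR listing.
    readings = re.findall(r"(\d+) degrees C", ipmi("sdr", "type", "temperature"))
    return max(int(r) for r in readings) if readings else 0

def set_fan_pct(pct: int) -> None:
    # Commonly cited Dell PowerEdge raw codes: take manual fan control, then set a percent.
    ipmi("raw", "0x30", "0x30", "0x01", "0x00")
    ipmi("raw", "0x30", "0x30", "0x02", "0xff", f"0x{pct:02x}")

while True:
    temp = max_temp_c()
    # Hold the 30% floor, then ramp linearly once the hottest sensor passes the threshold.
    pct = MIN_PCT if temp < RAMP_AT_C else min(100, MIN_PCT + (temp - RAMP_AT_C) * 5)
    set_fan_pct(pct)
    time.sleep(POLL_SECONDS)
```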

[-] webuge@lemmy.dbzer0.com 2 points 1 year ago

Well, this seems to be a good problem to have, hahah. If you need to get rid of some of those SSDs, count me in.

[-] xtremeownage@lemmyonline.com 4 points 1 year ago

eBay! You can pick up these "used" enterprise NVMes and SSDs for CHEAP. All 10 arrived with less than 5% wear.

[-] webuge@lemmy.dbzer0.com 1 points 1 year ago

Good to know, I will take a look. Thank you.

[-] krolden@lemmy.ml 1 points 1 year ago

Having a large flash pool really makes your life so much better.

Until you fill up all your space and have to buy more :p

[-] xtremeownage@lemmyonline.com 1 points 1 year ago

Hopefully that doesn't happen soon! I don't have too much room for more flash, lol.

But, I have quite a bit of available space, so there shouldn't be any concerns. Also, tomorrow, after a few adapters arrive, I'll be adding another 2x 1T flash drives to my Optiplex 5060 SFF.

What software are you running on all of this?

[-] xtremeownage@lemmyonline.com 0 points 1 year ago

From https://lemmyonline.com/comment/768355

A bit of everything. Publicly facing websites. Lemmyonline.com. A few popular Discord bots.

Linux ISO collection and streaming.

Lots of automation.

Lots of things around software development. Lots of things around systems and network administration.

Some kubernetes too.

A bit of everything, and nothing in particular.

What management interface is that, though, and is it part of the OS? What OS are you using anyway?

[-] xtremeownage@lemmyonline.com 1 points 1 year ago

The bottom screenshot is from Proxmox, which is the top-level OS in play.

[-] greybeard@lemmy.one 0 points 1 year ago

The first screenshot is of Dell's built-in system tools for servers. Being a Dell server, he should have Dell's iDRAC, which is a lights-out management module. It is really fantastic.

I wasn't talking about that. I was talking about the second screenshot. Thanks anyway.

The only problem I see is using x8 slots instead of x16 slots for double the storage.

[-] xtremeownage@lemmyonline.com 1 points 1 year ago* (last edited 1 year ago)

What's the problem?

Each NVMe uses 4 lanes. Each of these x8 slots holds two NVMes, for a total of 8 lanes.

The x16 slot already has 4x NVMes in it, lol. The other x16 slot has a GPU, which sits in that particular slot because of the lovely 3D-printed fan shroud.

One of the other full-height x8 slots also has a PLX switch and is loaded with 4 more NVMes.
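For anyone following the lane math, a quick sketch of the numbers above (slot widths as described in this thread):

```python
LANES_PER_NVME = 4  # each NVMe gets an x4 link when the slot is bifurcated

def drives_per_bifurcated_slot(slot_lanes: int) -> int:
    # Plain bifurcation: one x4 drive per 4 lanes, so x8 -> 2 drives, x16 -> 4 drives.
    return slot_lanes // LANES_PER_NVME

print(drives_per_bifurcated_slot(8))   # 2 NVMes on each of the x8 riser cards
print(drives_per_bifurcated_slot(16))  # 4 NVMes on the x16 card

# The PLX card is the exception: a PCIe switch can present more x4 drives than the
# slot's lane count allows, at the cost of sharing upstream bandwidth.
```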

