submitted 8 months ago* (last edited 7 months ago) by tusker@monero.town to c/monero@monero.town

I noticed that the Monero chain compresses by about 60%. Would it be possible to compress blocks before sending them from a remote node to a syncing wallet, thus saving a big chunk of bandwidth and time?
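A ratio like that is easy to sanity-check with a general-purpose compressor. A minimal sketch using zlib (the payload below is a stand-in; real block data is binary and will compress differently depending on its content):

```python
import zlib

# Stand-in for raw block data; actual Monero blocks are binary blobs whose
# compressibility depends on their structure, not on this toy payload.
data = b"monero block payload " * 8192

compressed = zlib.compress(data, level=6)
ratio = 1 - len(compressed) / len(data)
print(f"{len(data)} -> {len(compressed)} bytes ({ratio:.0%} saved)")
```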

Does anyone know if this is already happening during sync, or if not why?

edit: This can be done with an ssh tunnel if you have ssh access to your remote server. The "-C" option enables compression:

ssh -C -NL 18089:localhost:18089 server_username@server_address

Now point your wallet at 127.0.0.1:18089 and syncing should be faster, enjoy!

[-] mister_monster@monero.town 6 points 8 months ago

So you've got two components to sync time: bandwidth and processing. In Monero the wallet already has to attempt to decrypt every transaction in each block to see if it's ours, and that's what really takes time during syncing.

If you compressed blocks you'd save some bandwidth, but the client would have to spend time decompressing them before processing, which adds to sync time. A user with high processing power on a low-bandwidth connection might see a benefit, but for most people the bottleneck isn't bandwidth, it's processing power. Most people wouldn't see a sync time improvement with your proposed scheme.
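The argument can be made concrete with a toy model of per-block sync time. Every number below is invented purely for illustration: "scan" stands in for the per-block decryption work, and compression is assumed to shave 60% off the transferred bytes at some fixed decompression cost.

```python
# Toy model: per-block sync time = transfer time + scan time (+ decompress time).
# All constants are hypothetical, chosen only to illustrate the trade-off.

BLOCK_BYTES = 100_000   # hypothetical average block size
SAVED = 0.6             # assume compression removes 60% of the bytes
DECOMP_S = 0.01         # hypothetical per-block decompression cost (seconds)

def block_time(bandwidth_bps: float, scan_s: float, compressed: bool) -> float:
    size = BLOCK_BYTES * (1 - SAVED) if compressed else BLOCK_BYTES
    extra = DECOMP_S if compressed else 0.0
    return size / bandwidth_bps + scan_s + extra

# Fast CPU on a slow link: transfer dominates, so compression wins.
print(block_time(100_000, 0.001, False), block_time(100_000, 0.001, True))

# Slow CPU on a fast link: scanning dominates, and the decompression
# overhead makes the compressed case slightly worse.
print(block_time(10_000_000, 0.05, False), block_time(10_000_000, 0.05, True))
```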

[-] SummerBreeze@monero.town 1 points 8 months ago

How come the fees and which node I used (public or not) made such a difference, if the issue was processing power? Isn't that the bottleneck?

[-] mister_monster@monero.town 2 points 8 months ago* (last edited 8 months ago)

Well, the bandwidth of the remote node is a potential bottleneck, as is the bandwidth of the person syncing. Whichever is smaller sets the maximum rate at which data can be sent (ignoring the connection path, for simplicity), and that can affect sync speed significantly. If you've got a powerful computer that can do a ton of operations per second and check a ton of blocks for transactions, your bottleneck is going to be bandwidth. Compressing the blocks as you get them can alleviate that, at the cost of decompressing them, which slows your processing of them.

Stronger compression also costs more CPU time, and the relationship isn't linear: squeezing out another 10% of size typically takes a lot more than 10% more processing. Compress too much and the processing cost eats up the bandwidth benefit, so there's a point of equilibrium that's different for each node on the network, based on its bandwidth and processing power. Obviously we cannot compress differently for each node, so any fixed scheme is a trade-off between bandwidth and hardware capability: compression favors low-bandwidth, higher-power nodes, and no compression favors higher-bandwidth, lower-power nodes. Further, lossless compression can't shrink data below its entropy, so there's a practical limit even ignoring processing time. You also have to consider the processing power of the remote node, since it has to compress the blocks.
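The diminishing returns are easy to see with zlib: higher levels spend more compression time for progressively smaller size gains. A sketch with stand-in data (real block data would behave differently, and exact timings vary by machine):

```python
import time
import zlib

# Stand-in payload; real block data compresses differently.
data = b"monero block payload with ring signatures and key images " * 20000

for level in (1, 6, 9):
    t0 = time.perf_counter()
    out = zlib.compress(data, level=level)
    elapsed = time.perf_counter() - t0
    print(f"level {level}: {len(out):>8} bytes, {elapsed * 1000:.2f} ms")
```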

this post was submitted on 24 Mar 2024
15 points (77.8% liked)
