1. Intro
2. Storage servers have a load imbalance issue
3. Solutions to mitigate the load imbalance
4. Second, balance the load between clusters
5. Natural goals for a distributed caching mechanism
6. Design Challenges of DistCache
7. Challenge #1: How to allocate the cached items?
8. Independent hashes to allocate the cached items
9. Challenge #2: How to query the cached items?
10. Theoretical Guarantee behind DistCache
11. Proof Sketch: Convert to a perfect matching problem
12. Remarks on the DistCache Analysis
13. Example Deployment Scenarios of DistCache
14. Case Study: Switch-based distributed caching
15. Implementation Overview
16. P4: Programmable Protocol-independent Packet Processing
17. Evaluation Setup
18. Evaluation Takeaways
19. Conclusions
Description:
Explore a groundbreaking approach to load balancing in large-scale storage systems through this award-winning conference talk. Delve into DistCache, a novel distributed caching mechanism that offers provable load balancing for large-scale storage infrastructures. Learn how this solution co-designs cache allocation with cache topology and query routing, partitioning hot objects using independent hash functions across multiple cache layers. Discover the theoretical foundations behind DistCache, including techniques from expander graphs, network flows, and queuing theory. Examine the design challenges faced and their solutions, such as cache allocation strategies and efficient query routing. Gain insights into the implementation of DistCache, particularly in the context of switch-based caching, and understand its potential applications in various storage systems. Analyze the evaluation setup and key takeaways that demonstrate DistCache's ability to scale cache throughput linearly with the number of cache nodes.
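
The description mentions the two core mechanisms: allocating hot objects across cache layers with independent hash functions, and routing each query efficiently between the resulting replicas (the DistCache paper uses a power-of-two-choices style rule for this). The sketch below is only an illustration of those two ideas under simplifying assumptions, not the authors' implementation; the `DistCacheSketch` class, node counts, hash seeds, and the in-memory dictionaries standing in for cache nodes are all hypothetical.

```python
# Minimal sketch of independent-hash cache allocation plus
# least-loaded (power-of-two-choices style) query routing.
# All names and parameters here are illustrative assumptions.
import hashlib
from collections import defaultdict

def h(seed: str, key: str, buckets: int) -> int:
    """Seeded hash: different seeds give practically independent mappings."""
    digest = hashlib.sha256(f"{seed}:{key}".encode()).hexdigest()
    return int(digest, 16) % buckets

class DistCacheSketch:
    def __init__(self, upper_nodes: int = 4, lower_nodes: int = 4):
        self.upper_nodes = upper_nodes
        self.lower_nodes = lower_nodes
        # In-memory dicts stand in for the cache nodes of each layer.
        self.upper = [dict() for _ in range(upper_nodes)]
        self.lower = [dict() for _ in range(lower_nodes)]
        self.load = defaultdict(int)  # per-node query counters

    def allocate(self, key: str, value):
        """Cache a hot object on one node per layer, chosen by
        independent hashes so the two placements are uncorrelated."""
        u = h("upper", key, self.upper_nodes)
        l = h("lower", key, self.lower_nodes)
        self.upper[u][key] = value
        self.lower[l][key] = value

    def query(self, key: str):
        """Route the query to whichever of the two candidate cache
        nodes is currently less loaded."""
        u = ("U", h("upper", key, self.upper_nodes))
        l = ("L", h("lower", key, self.lower_nodes))
        target = min((u, l), key=lambda node: self.load[node])
        self.load[target] += 1
        layer = self.upper if target[0] == "U" else self.lower
        return layer[target[1]].get(key)  # miss -> fall through to storage

# Usage: cache a hot key, then query it; even a heavily skewed key set
# spreads its load across both layers rather than hammering one node.
cache = DistCacheSketch()
cache.allocate("hot-key-1", "value-1")
print(cache.query("hot-key-1"))
```

Because the two hash functions place each hot object on uncorrelated nodes in the two layers, no single node lies on the only path to many hot objects, and sending each query to the less-loaded replica spreads even a skewed workload; the talk's theoretical section makes this intuition precise.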

DistCache - Provable Load Balancing for Large-Scale Storage Systems with Distributed Caching

USENIX