
Token Coherence is a cache coherence protocol able to capture simultaneously the best attributes of traditional protocols: low latency and scalability. However, it may lose these desirable features when (1) several nodes contend for the same memory block and (2) nodes write highly shared blocks. The first situation leads to simultaneous broadcast requests, which threaten the protocol's scalability. The second results in a burst of token responses directed to the writer, which turns it into a bottleneck and increases latency. To address these problems, we propose a switch-based packing technique that encapsulates several messages, while in transit, into just one. Applied to simultaneous broadcasts, it significantly reduces their bandwidth requirements (by up to 45%). Applied to token responses, it lowers their transmission latency (by 70%). The packing technique thus decreases both latency and coherence traffic, improving system performance (about a 15% reduction in runtime).
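As a rough illustration of the idea (not the authors' implementation), the Python sketch below models how a switch could merge in-transit token responses that target the same writer and block into a single message carrying the combined token count. The names `TokenResponse` and `pack_responses`, and the simplified message fields, are hypothetical assumptions made for this sketch.

```python
from dataclasses import dataclass

@dataclass
class TokenResponse:
    """Simplified model of a coherence response carrying tokens for one block."""
    block: int               # address of the memory block
    dest: int                # node id of the requesting writer
    tokens: int              # number of tokens carried by this message
    has_owner: bool = False  # whether the owner token travels with this message

def pack_responses(queue):
    """Merge queued responses that share (block, dest) into one packed message.

    This mimics the switch-based packing idea: instead of forwarding a burst
    of small token responses to the writer, the switch emits a single message
    whose token count is the sum of the packed ones.
    """
    packed = {}
    for msg in queue:
        key = (msg.block, msg.dest)
        if key in packed:
            packed[key].tokens += msg.tokens
            packed[key].has_owner |= msg.has_owner
        else:
            # Copy the first message so the original queue entries stay untouched.
            packed[key] = TokenResponse(msg.block, msg.dest, msg.tokens, msg.has_owner)
    return list(packed.values())

if __name__ == "__main__":
    # Four sharers answer the same writer (node 0) for block 0x80;
    # a single packed response leaves the switch instead of four.
    inbound = [
        TokenResponse(0x80, 0, 1),
        TokenResponse(0x80, 0, 1),
        TokenResponse(0x80, 0, 2, has_owner=True),
        TokenResponse(0x80, 0, 1),
    ]
    for msg in pack_responses(inbound):
        print(msg)  # TokenResponse(block=128, dest=0, tokens=5, has_owner=True)
```

The same merging rule can be read in the other direction for simultaneous broadcast requests: requests for the same block that meet at a switch output port can share one packet, which is where the bandwidth savings reported in the abstract would come from.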
► Broadcast traffic threatens the scalability of token-based cache coherence protocols.
► We propose a simple technique to pack coherence messages in transit at switches.
► Both the bandwidth required by broadcast traffic and overall network consumption diminish.
► Cache-miss latency and miss rate decrease, reducing application runtime by up to 15%.
► The scalability of the Token Coherence protocol significantly improves.
Journal: Journal of Parallel and Distributed Computing - Volume 72, Issue 3, March 2012, Pages 409–423