Abstract: In this paper, we develop a high-level simulation model of DRAM controllers that captures features such as burst alignment and scheduling policies. The model is built on SystemC/TLM for ESL platform integration. Compared with a commercial RTL implementation, the model has a worst-case error of 4.5%. Owing to its fast simulation speed, we then apply the model to explore two design trade-offs. The results show that sharing command queues among ports can improve throughput by 10.9%, and that tuning the read/write switching policy can reduce average read latency by up to 10.7% at the cost of increased write latency, which is tolerable with additional write buffers.
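To illustrate the modelling style the abstract refers to, the following is a minimal SystemC/TLM sketch of a loosely timed DRAM-controller target. It is not the paper's model: the module name, the burst size, and the per-burst latency are hypothetical placeholders, and the timing annotation only shows how burst alignment could be charged to a transaction's delay.

```cpp
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_target_socket.h>

// Hypothetical high-level DRAM controller target (illustrative sketch only).
struct DramCtrlModel : sc_core::sc_module {
    tlm_utils::simple_target_socket<DramCtrlModel> socket;

    SC_CTOR(DramCtrlModel) : socket("socket") {
        // Blocking transport is enough for a loosely timed ESL model.
        socket.register_b_transport(this, &DramCtrlModel::b_transport);
    }

    void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
        // Placeholder timing parameters, not values from the paper.
        const unsigned         burst_bytes = 32;
        const sc_core::sc_time t_burst(10, sc_core::SC_NS);

        // Align the access to DRAM burst boundaries and charge one burst
        // time per burst touched; a real model would add scheduling,
        // bank-state, and read/write switching effects here.
        const uint64_t addr  = trans.get_address();
        const unsigned len   = trans.get_data_length();
        const uint64_t first = addr / burst_bytes;
        const uint64_t last  = (addr + len - 1) / burst_bytes;
        delay += t_burst * static_cast<double>(last - first + 1);

        trans.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};
```

In such a sketch, port sharing of command queues or a different read/write switching policy would be modelled by changing how incoming transactions are queued and ordered before their delay is computed, rather than by cycle-accurate RTL behaviour.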