Please redirect me if I'm in the wrong place. I didn't see an obvious "forum location" for this question, so I'm starting here; the Hardware processor forum wouldn't take it because I'm developing software.

I've found that the RDTSC instruction only ever returns even values (the least-significant bit is never set) on several of the chips I have access to. However, some chips return both even and odd values (the least-significant bit is sometimes set). This occurs both when calling rdtsc() from inside the kernel (i.e. JitterEntropy timestamp collection) and in the attached user-space program. I've tried different optimization options with the GNU C compiler to make sure there wasn't some obvious optimization impacting this. I'm running various versions of the Linux kernel, including 4.4.12, 4.4.111, and 5.4.72, as well as macOS Big Sur.