MDIO analyser

Hi.
We are using a Logic Pro 8 to debug the MDIO interface of an RTL8367RB.


I see that when data is sourced by the PHY, the analyser samples on the falling edge.
I tried to find how MDIO is implemented in CPUs.
For example, ST provides this diagram:

On the other hand, TI draws something like this:

My question is: what documentation was used when the MDIO analyser was developed?

The MDIO analyzer source is on GitHub, and I found the following issue reported:

The Saleae support article for the MDIO analyzer references Wikipedia, which in turn refers to the IEEE 802.3 standard(s), so presumably that was the basis.

As noted in the GitHub issue, the MDIO analyzer decoding can be incorrect if an MDC falling edge arrives too soon (before the MDIO line has transitioned). However, it should work fine as long as each MDIO transition always beats the corresponding MDC falling edge (i.e., at slower MDC clock speeds and/or with a PHY that has shorter MDIO data delays).
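To illustrate the falling-edge sampling behavior, here is a toy Python sketch (not the actual analyzer code; the edge times, transition lists, and helper names are all made up for the example):

```python
# Toy model: the decoded bit at each MDC falling edge is whatever level the
# MDIO line held at that exact instant (times in ns).
def mdio_level_at(t, transitions, initial=0):
    """Return the MDIO level at time t, given sorted (time, new_level) pairs."""
    level = initial
    for when, new_level in transitions:
        if when <= t:
            level = new_level
        else:
            break
    return level

def sample_on_falling_edges(falling_edges, mdio_transitions):
    return [mdio_level_at(t, mdio_transitions) for t in falling_edges]

# PHY drives a '1' that becomes valid 20 ns *before* the falling edge at 500 ns:
print(sample_on_falling_edges([500], [(480, 1)]))  # [1] -> decoded correctly
# Same bit, but MDIO only transitions 20 ns *after* the falling edge:
print(sample_on_falling_edges([500], [(520, 1)]))  # [0] -> previous level captured
```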

MDIO References:

Were you having decoding problems, or were you just curious about how the MDIO analyzer was designed? What MDC clock speed are you using? What is the PHY's MDIO data delay (from the MDC rising edge to the MDIO transition)? Are the MDIO transitions happening after the MDC falling edges?

Also, what digital sample rate and threshold levels are you using? Have you looked at the analog signals, too? In this case, using the maximum analog sample rate of 50 MS/s is recommended to catch any subtle signal characteristics. However, if your MDC clock is running too fast and/or the PHY's MDIO data delay is just too long, you may need to wait until the known issue above is resolved by Saleae (or submit a pull request on GitHub if you can fix it yourself :wink:).

Hello.
Thank you for the reply.
I think I got it: the documentation specifies timing for sourcing MDIO, not for sampling it (unlike I2C).
Our problem is that the analyzer result differs from what the hardware reads; the offset is 1 bit (i.e. 0x3000 vs. 0x6000). Our bigger problem is that we don’t know what the right answer is for the RTL8367RB :grinning:
We have been checking the analog signals with a separate oscilloscope; the levels are OK (3.3 V) and the Logic Pro 8 is configured accordingly.
We will try to slow down the bus.
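Just as a sanity check on the 1-bit offset (assuming the difference really is a plain shift of the 16-bit data field; the values are the ones quoted above):

```python
# The two readings differ by exactly one bit position in the 16-bit data field:
analyzer_value = 0x3000
hardware_value = 0x6000
print(hex((analyzer_value << 1) & 0xFFFF))  # 0x6000, i.e. a one-bit shift
```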

I was curious about how the MDIO bus is supposed to work vs. the Realtek implementation, and found the following timing diagram when googling for an RTL8367RB datasheet:

… and the following timing characteristics:

The strange thing is that the MDC to MDIO Delay Time (symbol t4) is specified as 0 to 40 ns (typically 2.8 ns) after the falling edge of MDC.

However, searching for another MDIO specification, I found an example from Microchip (KSZ9131RNX):

… and the Microchip device has these timing characteristics:

Notice how Microchip’s t_val parameter is 80 ns max, measured from the rising edge of MDC (rather than from the falling edge, as specified for the Realtek device above)?

I’m not an expert on the MDIO bus interface and don’t have the IEEE 802.3 specifications handy, but it seems like the Realtek part may have MDIO output characteristics that don’t strictly comply with the IEEE specification.
I base this assumption on an IEEE spec quote provided in the NXP community post (linked previously), which states (bold added for emphasis):

According to IEEE 802.3: “When the MDIO signal is sourced by the PHY, it is sampled by the STA synchronously with respect to the rising edge of MDC. The clock to output delay from the PHY, as measured at the MII connector, shall be a minimum of 0 ns, and a maximum of 300 ns…”

If this understanding is correct, then it seems like the current Saleae MDIO analyzer won’t support the Realtek device’s signal behavior ‘as is’, since it samples exactly on MDC falling edges. That apparently assumes the falling edge is sufficiently delayed from the MDC rising edge to cover the IEEE 802.3 maximum clock-to-output delay of 300 ns, or at least whatever a given PHY actually requires (the Microchip part, for example, only needs an 80 ns maximum delay for MDIO to be ready).
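To put some rough numbers on that (a back-of-the-envelope check; the 2.5 MHz figure is just the classic MDC rate and a 50% duty cycle is assumed, not necessarily what you are running):

```python
# Falling-edge sampling only works if the PHY's clock-to-output delay
# (referenced to the MDC *rising* edge) fits within the MDC high time.
def falling_edge_sampling_ok(mdc_hz, phy_max_delay_ns, duty=0.5):
    high_time_ns = duty / mdc_hz * 1e9
    return high_time_ns > phy_max_delay_ns

print(falling_edge_sampling_ok(2.5e6, 300))  # False: IEEE worst case (300 ns) > 200 ns high time
print(falling_edge_sampling_ok(2.5e6, 80))   # True: Microchip's 80 ns fits in 200 ns
print(falling_edge_sampling_ok(1.0e6, 300))  # True: at 1 MHz the high time is 500 ns
```

Of course, this check only covers PHYs whose output delay is referenced to the rising edge; if the Realtek part really drives MDIO relative to the falling edge, a sample taken exactly at the falling edge would capture the previous bit regardless of the MDC speed.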

Thus, it looks like a custom modification (or future update) of the MDIO analyzer would be needed, for example (see the sketch after the list below):

  • Delay the MDIO sample point by a user-defined amount after the MDC falling edge, to better match Realtek’s implementation
  • Implement a user-defined delay after the MDC rising edge for more IEEE 802.3 compliant behavior (instead of using the falling edge as the reference point); non-compliant implementations could then just pad some extra delay to account for any MDC high-pulse timing variations
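As a very rough Python sketch of what either option might look like (the function name, parameter names, and edge lists are hypothetical, not the analyzer’s real API):

```python
def decode_mdio_bits(mdc_edges_ns, mdio_transitions, sample_delay_ns=0.0):
    """Sample MDIO a user-defined delay after each chosen MDC edge.

    mdc_edges_ns     : sorted MDC edge times in ns (rising edges for the
                       IEEE-style option, falling edges for the Realtek-style option)
    mdio_transitions : sorted (time_ns, new_level) pairs on the MDIO line
    sample_delay_ns  : extra delay before sampling, e.g. the PHY's maximum
                       clock-to-output delay (300 ns worst case per 802.3)
    """
    bits, level, i = [], 0, 0
    for edge in mdc_edges_ns:
        sample_time = edge + sample_delay_ns
        # Advance the MDIO level up to the sample point.
        while i < len(mdio_transitions) and mdio_transitions[i][0] <= sample_time:
            level = mdio_transitions[i][1]
            i += 1
        bits.append(level)
    return bits

# Example: a Realtek-style PHY that updates MDIO up to ~180 ns after the MDC
# falling edge could be decoded with the falling-edge list and a 200 ns delay:
# decode_mdio_bits(mdc_falling_edges_ns, mdio_transitions, sample_delay_ns=200)
```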

Out of curiosity, did you have any luck slowing down the MDC clock speed?

For everyone’s information, all the IEEE 802 (Ethernet, WiFi) specifications are freely available from Browse Standards | Get Program | IEEE Xplore

See https://www.ieee802.org/ for (a lot) more information.

In my 2015 edition of the 802.3 specification, the MDIO functionality is specified in section 22.2.4, Management functions, with timing in section 22.3.4, MDIO timing relationship to MDC. When the PHY is sourcing the MDIO signal, it must follow Figure 22-19, MDIO sourced by PHY.
The positive edge of MDC to data stable must be 0 ns min, 300 ns max. So it doesn’t sound like Realtek is following the standard.


Hi.
We captured a waveform with a slow MDC (~400 kHz).
You are right: the RTL8367RB drives the MDIO line on the falling edge (50 to 180 ns after it).


Session MDIO Read Reg 1300.sal (3.9 KB)