Both of these programs show the exact same GPU core and memory clocks, just in a slightly different way. On Nvidia, clocks are set in fixed steps, as opposed to AMD, where you can adjust by a single MHz.
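To make the "fixed steps" idea concrete, here is a minimal sketch (not an NVIDIA API, and the 13 MHz step size is an assumption for illustration; the real bin size varies by GPU generation). The point is that two readouts only 1 MHz apart land in the same step, so they describe the same actual clock:

```python
# Illustrative sketch: Nvidia cards change core clocks in fixed bins
# rather than 1 MHz at a time. STEP_MHZ = 13 is an assumed value for
# illustration only; the true bin size depends on the GPU generation.
STEP_MHZ = 13

def snap_to_step(target_mhz: int, step: int = STEP_MHZ) -> int:
    """Snap a requested clock to the nearest multiple of the step size."""
    return round(target_mhz / step) * step

# Readouts of 1379 and 1380 MHz fall into the same bin, i.e. they are
# two renderings of one and the same actual clock:
print(snap_to_step(1379) == snap_to_step(1380))  # True
```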
Firstly, the core: one program reports 1380 MHz while the other reports 1379 MHz; the true clock is probably 1380 MHz - this is the current GPU core clock. The 1253/1404 MHz boost clock is what your GPU is set to after overclocking, but as you can see, it is currently holding 1380 MHz, not the maximum value of 1404 MHz (perfectly fine - it depends on temperatures).
Secondly, the memory: 3505 MHz/7012 MHz (with some variation, as no monitoring software is perfect) is the exact same clock speed - one program shows the real clock, the other shows it doubled. It works like DDR RAM: if you have a DDR4 kit rated at 3000 MHz, CPU-Z will show only 1500 MHz, because DDR ("double data rate") memory transfers data twice per clock cycle, so the effective rate is double the real clock.