My question is, as far as I can tell, closely related to this post.
I need to plot some data with marker sizes strictly proportional to the axis units (I already asked about this here).
My approach is the following:
- create an empty scatter plot as a pixel reference
- scatter two points, one at the lower-left and one at the upper-right corner
- limit the axes to exactly these two points
- use transData.transform to get the pixel coordinates of those two points
- take the pixel distance as the difference between the pixel coordinates of these two points
- (now that I have the distance-to-pixel ratio, scatter my data with s=(size*dist_to_pix_ratio)**2; but that is not important right now)
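For completeness, the last step would look roughly like this (just a sketch: the data `x`, `y` and the diameters in `sizes` are made up for illustration, and I use only the x ratio here):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend, only for this sketch
import matplotlib.pyplot as plt

fig = plt.figure(figsize=(7, 7))
ax1 = fig.add_subplot(111, aspect='equal')

# empty scatter plot as a pixel reference
ax1.scatter([0.0, 1.0], [0.0, 1.0], s=0.0)
ax1.set_xlim(0.0, 1.0)
ax1.set_ylim(0.0, 1.0)

# pixel coordinates of the two corner points
lowleft = ax1.transData.transform((0.0, 0.0))
upright = ax1.transData.transform((1.0, 1.0))

# pixels per data unit along x
dist_to_pix_ratio = upright[0] - lowleft[0]

# made-up data; `sizes` holds the desired marker diameters in data units
x = [0.25, 0.5, 0.75]
y = [0.25, 0.5, 0.75]
sizes = [0.05, 0.10, 0.05]
ax1.scatter(x, y, s=[(sz * dist_to_pix_ratio) ** 2 for sz in sizes])
```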
The problem is: when I do exactly what I described above, I get two different pixel counts for the x-axis and the y-axis.
Here is a minimal example:
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(7,7))
ax1 = fig.add_subplot(111, aspect='equal')
# set up an empty scatter plot for pixel reference
xedges = [0.0, 1.0]
yedges = [0.0, 1.0]
emptyscatter = ax1.scatter(xedges, yedges, s=0.0)
# set the axes limits
ax1.set_xlim(0.0, 1.0)
ax1.set_ylim(0.0, 1.0)
# calculate the pixel-to-unit ratio
upright = ax1.transData.transform((1.0, 1.0))
lowleft = ax1.transData.transform((0.0, 0.0))
x_to_pix_ratio = upright[0] - lowleft[0]
y_to_pix_ratio = upright[1] - lowleft[1]
print x_to_pix_ratio, y_to_pix_ratio
which returns:
434.0 448.0
Can anybody explain why they are not equal?
I'm not sure whether it's relevant, but I'm using Python 2.7.12 and matplotlib 1.5.1.