Hi,
I have recently come across a rather annoying mathematical problem. I am currently making a computer program that will scale text to fit inside a box 5 pixels wide by 7 pixels high. (Pixels are the dots on the screen.)
At first it seemed easy, since I was able to get the width and height of the character. For example, if the character's size was 7 by 8, I could simply do this:
5/7 = x ≈ 0.7143
7/8 = y = 0.875
Then I just say I want the image scaled by x and y, which should give the character back as 5 by 7.
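The arithmetic above can be sketched like this (Python used purely for illustration):

```python
# character measured as 7 wide by 8 high; target box is 5 by 7
x = 5 / 7   # horizontal scale factor, ≈ 0.7143
y = 7 / 8   # vertical scale factor, 0.875

# applying the factors recovers the target size
print(7 * x)  # 5.0
print(8 * y)  # 7.0
```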
When that wasn't working, I saw that there was an invisible box around the character which the computer used instead, so the perceived width and height were wrong.
For example if the width counted by me is 7
And the computer thinks it's 9 (no way to change this)
And I want the character to fit inside 5
I can no longer do 5/7, because that factor will be applied to the 9; and I can't do 5/9, because that would make the invisible box 5 and the actual width of the character (as counted by me) smaller than 5.
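The mismatch described above, and one possible compensation, can be sketched numerically. This is only a sketch, assuming the scaling routine takes a target width and applies it to the invisible box rather than the visible glyph:

```python
box, visible, target = 9, 7, 5   # box width, measured glyph width, desired width

# Option 1: scale by target/visible — but the routine applies it to the box
print(box * (target / visible))      # ≈ 6.43: the box overshoots 5

# Option 2: scale by target/box — now the visible glyph undershoots
print(visible * (target / box))      # ≈ 3.89: character narrower than 5

# Compensated request: ask for a box of box * target / visible, so that
# after the routine scales the box, the visible part comes out at target
requested = box * target / visible   # 45/7 ≈ 6.43
print(visible * (requested / box))   # 5.0
```

The same idea works for the height: scale by (desired visible size) / (measured visible size), then convert that factor into whatever quantity the routine actually accepts.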
What maths can I use so that I can fit the character inside the 5 by 7 grid?
Thanks for your time
You might want to learn to make your own font with these specs and then distribute the font with your software.
igloo myrtilles fourmis
I think John's solution is the best. Even if you did manage to scale them correctly, the characters would look blurred and possibly be illegible.
Why did the vector cross the road?
It wanted to be normal.