DPI and screen resolution on OS X

Chris Sherlock
DPI and screen resolution on OS X

Hi all,

I’ve mentioned this briefly to Tor on IRC, but thought I’d email the mailing list with a general enquiry.

I noticed that we don’t actually get the “true” DPI for OS X, nor the actual resolution - at least on high resolution screens (Retina in particular).

That’s because Apple have a concept of logical points, where each point scales depending on the resolution of the screen. To get the actual resolution, you need to convert the NSScreen’s frame into its backing coordinate system (an NSRect measured in pixels), at which point you can calculate the PPI (I’ll use PPI from now on; it’s more accurate than DPI).

I figured out how to get this out of OS X and I submitted two patches to Gerrit for review:

1. https://gerrit.libreoffice.org/#/c/21948/ - vcl: (quartz) get the actual DPI on OS X
2. https://gerrit.libreoffice.org/#/c/21973/ - vcl: (quartz) get the actual pixel height and width of OS X

It’s actually pretty simple - you just use:

NSRect aFrame = [pScreen convertRectToBacking:[pScreen frame]];

This gives the actual resolution in pixels, not logical points (backing coordinates are always in pixels).

However, whilst everything renders correctly, the size of everything obviously doubles (or changes by whatever scale factor the screen resolution implies).
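For what it’s worth, the doubling is just the backing scale factor at work. A minimal sketch with made-up numbers (a 2.0 factor is typical of Retina panels; the panel dimensions are hypothetical, not from any real device):

```python
# Illustrative only: logical points vs backing-store pixels on a scaled
# (e.g. Retina) screen. Backing coordinates are always in pixels.
def backing_pixels(logical_w, logical_h, scale):
    """Map a logical-point frame to backing pixels, as convertRectToBacking: does."""
    return logical_w * scale, logical_h * scale

# Hypothetical panel: a 1440x900 point frame at a 2.0 backing scale factor.
print(backing_pixels(1440, 900, 2.0))  # -> (2880.0, 1800.0)
```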

I guess I was wondering what the impact is, or what challenges have we had with not getting the *actual* DPI and screen resolution on OS X builds? Has anyone noticed anything odd when they are using a Mac, or developing for it?

The other question is: why would we not want to use the actual DPI and screen resolution?

Thanks,
Chris

_______________________________________________
LibreOffice mailing list
[hidden email]
http://lists.freedesktop.org/mailman/listinfo/libreoffice
Kohei Yoshida

Re: DPI and screen resolution on OS X

On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
> The other question is: why would we not want to the actual DPI and
> screen resolution?

My understanding is that, historically, the OS provided a function to
query DPI but what gets returned from such function was not always
accurate (or always not accurate depending on who you ask).  So, the
workaround at the time was to assume that DPI is always 96 (and
hard-code that value) regardless of what the OS told you, which worked
just fine because the monitors used back in the day had the same screen
resolution.

I'm not sure if that's a non-issue today.  I don't know enough about
this topic to tell you that with confidence.

Kohei

SOS

Re: DPI and screen resolution on OS X


On 3/02/2016 3:55, Kohei Yoshida wrote:

> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>> The other question is: why would we not want to the actual DPI and
>> screen resolution?
> My understanding is that, historically, the OS provided a function to
> query DPI but what gets returned from such function was not always
> accurate (or always not accurate depending on who you ask).  So, the
> workaround at the time was to assume that DPI is always 96 (and
> hard-code that value) regardless of what the OS told you, which worked
> just fine because the monitors used back in the day had the same screen
> resolution.
Mostly, DPI is found in the header of a pixel file (taken by a camera). Unfortunately it's not the photographer who gets to decide about the needed DPI.
DPI is actually the wrong term for documents: Dots Per Inch is a measure used by output devices. Screens need a pixel per dot, but for print devices there is no precise correlation between the number of dots used by the device and the pixels needed in the image for maximum image quality.
The print industry has arrived at some standards by trial and error:
- monitor screens need 96 (up to 220 for Retina) pixels per inch
- laser printers need 150 pixels per inch (up to 2000+ dots)
- offset printers need 254-300 pixels per inch (up to 3000 dots)

For a document we must use Pixels Per Inch, calculated from the DPI needed by the final output device and represented in each document by a "print intention".
When producing docs for printing on an office laser printer we need fewer Pixels Per Inch than for docs (magazines, books) which are printed on offset machines.
When an image is loaded, the system can calculate the viewing size using the number of pixels needed by the "print intention". The user can then see the maximum size the image can have in his document without losing image quality.
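A minimal sketch of the sizing calculation Fernand describes, assuming a hypothetical "print intention" expressed in PPI (the function name and figures are illustrative, not anything from LibreOffice):

```python
# Hypothetical sketch: the maximum physical print size of an image, given
# its pixel dimensions and the PPI demanded by a document's "print intention".
def max_print_size_inches(px_w, px_h, intent_ppi):
    """Largest size (in inches) at which the image still meets intent_ppi."""
    return px_w / intent_ppi, px_h / intent_ppi

# A 3000x2000 pixel photo: generous at web intent, much smaller at offset intent.
print(max_print_size_inches(3000, 2000, 96))   # web/online intent
print(max_print_size_inches(3000, 2000, 300))  # offset print intent
```

At 300 PPI the same photo can only fill 10 by about 6.7 inches before print quality starts to degrade, which is the "maximum size" a user would want to see before inserting.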

Hope it helps

Fernand

>
> I'm not sure if that's a non-issue today.  I don't know enough about
> this topic to tell you that with confidence.
>
> Kohei
>

Chris Sherlock

Re: DPI and screen resolution on OS X

On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:

>
>
> On 3/02/2016 3:55, Kohei Yoshida wrote:
>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>> The other question is: why would we not want to the actual DPI and
>>> screen resolution?
>> My understanding is that, historically, the OS provided a function to
>> query DPI but what gets returned from such function was not always
>> accurate (or always not accurate depending on who you ask).  So, the
>> workaround at the time was to assume that DPI is always 96 (and
>> hard-code that value) regardless of what the OS told you, which worked
>> just fine because the monitors used back in the day had the same screen
>> resolution.
> Mostly DPI is found in the header of a pixelfile (taken by camera). Unfortunately it's not the photographer who gets to decide about the needed DPI.
> DPI is actually a wrong definition for documents, Dots Per Inch is a definition used by output devices. Screens need a PIXEL par DOT but for print devices there is no precise correlation between the number of dots used by the device and the pixels needed in  the image for having a maximum image-view quality.
> The print industry has come to some standards by trial and error.
> - monitor screens need 96 - (220-retina) pixels per inch
> - laser printers need 150 pixels per inch (up tot 2000 + dots)
> - offset printers need 254 -300 pixels per inch (up to 3000 dots)

Definitely true :-) Only in OS X’s case, it doesn’t actually report back the correct resolution unless you ask for the backing coordinate system.

The PPI business is a red herring I think I’ve introduced into this discussion I’m afraid. We calculate the PPI ourselves (and call it DPI) based on the reported pixels, and the size of the screen in mm (which we obviously convert to inches).
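That calculation (reported pixels, plus the physical size in mm converted to inches) amounts to something like this sketch; the panel figures are a hypothetical example, not values from the patches:

```python
MM_PER_INCH = 25.4

def ppi(pixels, size_mm):
    """Pixels per inch, from a pixel count and the matching physical size in mm."""
    return pixels / (size_mm / MM_PER_INCH)

# Hypothetical Retina panel: 2880 backing pixels across roughly 331 mm.
print(round(ppi(2880, 331)))  # -> 221, i.e. "just over 200 PPI"
```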

I guess I’m curious as to what is relying on the screen resolution and PPI.

Although… it’s funny that we have the function SalGraphics::GetResolution, but that returns the PPI!

Chris
SOS

Re: DPI and screen resolution on OS X


On 3/02/2016 11:32, Chris Sherlock wrote:

> On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:
>>
>> On 3/02/2016 3:55, Kohei Yoshida wrote:
>>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>>> The other question is: why would we not want to the actual DPI and
>>>> screen resolution?
>>> My understanding is that, historically, the OS provided a function to
>>> query DPI but what gets returned from such function was not always
>>> accurate (or always not accurate depending on who you ask).  So, the
>>> workaround at the time was to assume that DPI is always 96 (and
>>> hard-code that value) regardless of what the OS told you, which worked
>>> just fine because the monitors used back in the day had the same screen
>>> resolution.
>> Mostly DPI is found in the header of a pixelfile (taken by camera). Unfortunately it's not the photographer who gets to decide about the needed DPI.
>> DPI is actually a wrong definition for documents, Dots Per Inch is a definition used by output devices. Screens need a PIXEL par DOT but for print devices there is no precise correlation between the number of dots used by the device and the pixels needed in  the image for having a maximum image-view quality.
>> The print industry has come to some standards by trial and error.
>> - monitor screens need 96 - (220-retina) pixels per inch
>> - laser printers need 150 pixels per inch (up tot 2000 + dots)
>> - offset printers need 254 -300 pixels per inch (up to 3000 dots)
> Definitely true :-) Only in OS X’s case, it doesn’t actually report back the correct resolution unless you ask for the backing coordinate system.
>
> The PPI business is a red herring I think I’ve introduced into this discussion I’m afraid. We calculate the PPI ourselves (and call it DPI) based on the reported pixels, and the size of the screen in mm (which we obviously convert to inches).
It's a bit the wrong discussion: what we see on screen has no relevance, since the user can "zoom" the document until he is happy with the image quality on screen.
But in the current situation, LO users have no idea how big (in size) they can place an image in a document.
When the doc is intended for online use (email and web) then a minimum of 96 pixels per inch is needed. More is no problem, but is in many cases overkill.
Whoever is editing a "book" or a "magazine" needs a minimum of 254 pixels per inch to get good image quality after printing.
When using fewer pixels, the book pages look fine on screen but will have a creepy print quality.
So having a new "DocumentProperty" indicating the needed pixels (for printing) would make it possible to make the "size" calculations before inserting.

>
> I guess I’m curious as to what is relying on the screen resolution and PPI.
>
> Although… it’s funny that we have the function SalGraphics::GetResolution, but that returns the PPI!
>
> Chris

Chris Sherlock

Re: DPI and screen resolution on OS X

On 3 Feb 2016, at 10:35 PM, SOS <[hidden email]> wrote:

>
>
> On 3/02/2016 11:32, Chris Sherlock wrote:
>> On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:
>>>
>>> On 3/02/2016 3:55, Kohei Yoshida wrote:
>>>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>>>> The other question is: why would we not want to the actual DPI and
>>>>> screen resolution?
>>>> My understanding is that, historically, the OS provided a function to
>>>> query DPI but what gets returned from such function was not always
>>>> accurate (or always not accurate depending on who you ask).  So, the
>>>> workaround at the time was to assume that DPI is always 96 (and
>>>> hard-code that value) regardless of what the OS told you, which worked
>>>> just fine because the monitors used back in the day had the same screen
>>>> resolution.
>>> Mostly DPI is found in the header of a pixelfile (taken by camera). Unfortunately it's not the photographer who gets to decide about the needed DPI.
>>> DPI is actually a wrong definition for documents, Dots Per Inch is a definition used by output devices. Screens need a PIXEL par DOT but for print devices there is no precise correlation between the number of dots used by the device and the pixels needed in  the image for having a maximum image-view quality.
>>> The print industry has come to some standards by trial and error.
>>> - monitor screens need 96 - (220-retina) pixels per inch
>>> - laser printers need 150 pixels per inch (up tot 2000 + dots)
>>> - offset printers need 254 -300 pixels per inch (up to 3000 dots)
>> Definitely true :-) Only in OS X’s case, it doesn’t actually report back the correct resolution unless you ask for the backing coordinate system.
>>
>> The PPI business is a red herring I think I’ve introduced into this discussion I’m afraid. We calculate the PPI ourselves (and call it DPI) based on the reported pixels, and the size of the screen in mm (which we obviously convert to inches).
> its a bit the wrong discussion: what we see on screen has no relevance: the user can "zoom" the document until he is happy with the image quality on screen
> But in the current situation, LO users has no idea how big (size) he can place a image in a document.
> When the doc is intented for online use (email and Web) then there is a minimum of 96 pixels par inch needed. More is no problem but is in many cases a overkill.
> Who is editing a "book" or a "magazine" need minimal 254 pixels par inch to has a good image quality after printing.
> When using less pixels the book pages  are looking fine on screen put shall have a creepy print quality
> So having a new "DocumentProperty" indicating the needed pixels (for printing)  make it possible to make the "size" calculations before inserting.

We are actually detecting the PPI… with the greatest of respect, I’ve actually implemented some testing changes to detect the correct PPI, and on my Mac it should actually be just over 200 PPI…

I think this is going in the wrong direction. I worked for Epson about 13 years ago, so I have some knowledge of printing :-) I could talk your ear off on colour management and halftoning, and I probably know a bit too much about piezo-electric crystal technology…

I’m really trying to understand what is relying on the resolution and what sort of impact fixing the resolution detection might be having on OS X systems.

Chris
Alex Thurgood

Re: DPI and screen resolution on OS X

In reply to this post by Chris Sherlock
Le 03/02/2016 00:52, Chris Sherlock a écrit :

Hi Chris,


> Hi all,
>
> I’ve mentioned this briefly to Tor on IRC, but thought I’d email the mailing list and a general enquiry.
>
> I noticed that we don’t actually get the “true” DPI for OS X, nor the actual resolution - at least on high resolution screens (Retina in particular).
>

From what I recall this has caused a few calculation issues in the past with some of the XML layout unit tests in sw on OS X - rect calculation rounding errors spring to mind, causing the unit tests to fail. As I only reported these at the time, and was not involved in fixing them, I can't say I understand the details.

I also wonder whether the horrible black or white bands and other display detritus we see on OS X in fullscreen would be affected by changes in this area (for the better, hopefully)

Alex


SOS

Re: DPI and screen resolution on OS X

In reply to this post by Chris Sherlock

On 3/02/2016 13:01, Chris Sherlock wrote:

> On 3 Feb 2016, at 10:35 PM, SOS <[hidden email]> wrote:
>>
>> On 3/02/2016 11:32, Chris Sherlock wrote:
>>> On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:
>>>> On 3/02/2016 3:55, Kohei Yoshida wrote:
>>>>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>>>>> The other question is: why would we not want to the actual DPI and
>>>>>> screen resolution?
>>>>> My understanding is that, historically, the OS provided a function to
>>>>> query DPI but what gets returned from such function was not always
>>>>> accurate (or always not accurate depending on who you ask).  So, the
>>>>> workaround at the time was to assume that DPI is always 96 (and
>>>>> hard-code that value) regardless of what the OS told you, which worked
>>>>> just fine because the monitors used back in the day had the same screen
>>>>> resolution.
>>>> Mostly DPI is found in the header of a pixelfile (taken by camera). Unfortunately it's not the photographer who gets to decide about the needed DPI.
>>>> DPI is actually a wrong definition for documents, Dots Per Inch is a definition used by output devices. Screens need a PIXEL par DOT but for print devices there is no precise correlation between the number of dots used by the device and the pixels needed in  the image for having a maximum image-view quality.
>>>> The print industry has come to some standards by trial and error.
>>>> - monitor screens need 96 - (220-retina) pixels per inch
>>>> - laser printers need 150 pixels per inch (up tot 2000 + dots)
>>>> - offset printers need 254 -300 pixels per inch (up to 3000 dots)
>>> Definitely true :-) Only in OS X’s case, it doesn’t actually report back the correct resolution unless you ask for the backing coordinate system.
>>>
>>> The PPI business is a red herring I think I’ve introduced into this discussion I’m afraid. We calculate the PPI ourselves (and call it DPI) based on the reported pixels, and the size of the screen in mm (which we obviously convert to inches).
>> its a bit the wrong discussion: what we see on screen has no relevance: the user can "zoom" the document until he is happy with the image quality on screen
>> But in the current situation, LO users has no idea how big (size) he can place a image in a document.
>> When the doc is intented for online use (email and Web) then there is a minimum of 96 pixels par inch needed. More is no problem but is in many cases a overkill.
>> Who is editing a "book" or a "magazine" need minimal 254 pixels par inch to has a good image quality after printing.
>> When using less pixels the book pages  are looking fine on screen put shall have a creepy print quality
>> So having a new "DocumentProperty" indicating the needed pixels (for printing)  make it possible to make the "size" calculations before inserting.
> We are actually detecting the PPI… with the greatest of respect, I’ve actually implemented some testing changes to detect the correct PPI and on my Mac is should actually be just over 200PPI…
OK, but how did you detect the PPI?
You need the "physical" dots given by the manufacturer of the screen. Then you can divide the total dots by the size of the screen.
Standard LED and LCD screens have around 100 dots per inch.
A Retina screen on a tablet or a smartphone could have 150-300 dots per inch.


>
> I think this is going in the wrong direction. I worked for Epson about 13 years ago, so I have some knowledge of printing :-) I could talk your ear off on colour management and halftoning, and I probably know a bit too much about piezo-electric crystal technology…
>
> I’m really trying to understand what is relying on the resolution and what sort of impact fixing the resolution detection might be having on OS X systems.
>
> Chris

Chris Sherlock

Re: DPI and screen resolution on OS X

In reply to this post by Alex Thurgood

> On 3 Feb 2016, at 11:08 PM, Alexander Thurgood <[hidden email]> wrote:
>
> Le 03/02/2016 00:52, Chris Sherlock a écrit :
>
> Hi Chris,
>
>
>> Hi all,
>>
>> I’ve mentioned this briefly to Tor on IRC, but thought I’d email the mailing list and a general enquiry.
>>
>> I noticed that we don’t actually get the “true” DPI for OS X, nor the actual resolution - at least on high resolution screens (Retina in particular).
>>
>
> From what I recall this has provided a few calculation issues in the
> past with some of the xml layout unit tests in sw on OSX - making rect
> calculation rounding errors spring to mind causing the unit test to
> fail. As I only reported these at the time, and not involved in fixing
> them, I can't say I understand the details.
>
> I also wonder whether the horrible black or white bands and other
> display detritus we see on OSX in fullscreen would be affected by
> changes in this area (for the better hopefully)
>
> Alex

If nobody has any objections, I can add an environment variable that forces it to pick up the actual screen resolution.

Something like OSX_FORCEBACKINGCOORDS

If you then want to troubleshoot this sort of thing and wonder if this might be helpful, set the environment variable, reload LO, and see if the same issue occurs.

Chris
Armin Le Grand

Re: DPI and screen resolution on OS X

In reply to this post by SOS
Hi,

comments inline

Am 03.02.2016 um 12:35 schrieb SOS:

>
> On 3/02/2016 11:32, Chris Sherlock wrote:
>> On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:
>>>
>>> On 3/02/2016 3:55, Kohei Yoshida wrote:
>>>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>>>> The other question is: why would we not want to the actual DPI and
>>>>> screen resolution?
>>>> My understanding is that, historically, the OS provided a function to
>>>> query DPI but what gets returned from such function was not always
>>>> accurate (or always not accurate depending on who you ask). So, the
>>>> workaround at the time was to assume that DPI is always 96 (and
>>>> hard-code that value) regardless of what the OS told you, which worked
>>>> just fine because the monitors used back in the day had the same
>>>> screen
>>>> resolution.
>>> Mostly DPI is found in the header of a pixelfile (taken by camera).
>>> Unfortunately it's not the photographer who gets to decide about the
>>> needed DPI.
>>> DPI is actually a wrong definition for documents, Dots Per Inch is a
>>> definition used by output devices. Screens need a PIXEL par DOT but
>>> for print devices there is no precise correlation between the number
>>> of dots used by the device and the pixels needed in  the image for
>>> having a maximum image-view quality.
>>> The print industry has come to some standards by trial and error.
>>> - monitor screens need 96 - (220-retina) pixels per inch
>>> - laser printers need 150 pixels per inch (up tot 2000 + dots)
>>> - offset printers need 254 -300 pixels per inch (up to 3000 dots)
>> Definitely true :-) Only in OS X’s case, it doesn’t actually report
>> back the correct resolution unless you ask for the backing coordinate
>> system.
>>
>> The PPI business is a red herring I think I’ve introduced into this
>> discussion I’m afraid. We calculate the PPI ourselves (and call it
>> DPI) based on the reported pixels, and the size of the screen in mm
>> (which we obviously convert to inches).
> its a bit the wrong discussion: what we see on screen has no
> relevance: the user can "zoom" the document until he is happy with the
> image quality on screen
> But in the current situation, LO users has no idea how big (size) he
> can place a image in a document.
> When the doc is intented for online use (email and Web) then there is
> a minimum of 96 pixels par inch needed. More is no problem but is in
> many cases a overkill.
> Who is editing a "book" or a "magazine" need minimal 254 pixels par
> inch to has a good image quality after printing.
> When using less pixels the book pages  are looking fine on screen put
> shall have a creepy print quality
> So having a new "DocumentProperty" indicating the needed pixels (for
> printing)  make it possible to make the "size" calculations before
> inserting.

It is relevant. If you have a vector graphic and it gets converted to a bitmap, the DPI from the system is used to define the resulting pixel size. Conversion to bitmap happens more often than it might seem. Examples:
- the user chooses to do so (context menu, convert to bitmap)
- some exporters which are not capable of using vector graphics
- PDF, e.g. PDF/A-1, which is not allowed to use transparencies and solves this by creating bitmaps where graphics and transparent parts overlap
- the 3D renderer, which targets bitmaps (charts, 3D objects)
Thus, the system DPI is essential. If the bigger DPI is used on Mac, it will enlarge all these conversions.
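A rough sketch of that enlargement effect with made-up numbers (a 10 x 5 cm vector shape rasterised at two system DPI values; 32-bit RGBA is assumed for the memory estimate):

```python
# Made-up numbers: pixel dimensions (and rough memory cost) of rasterising
# a vector graphic at a given system DPI.
MM_PER_INCH = 25.4

def raster_size(width_mm, height_mm, dpi):
    """Pixel width, pixel height, and approximate bytes for an RGBA raster."""
    w = round(width_mm / MM_PER_INCH * dpi)
    h = round(height_mm / MM_PER_INCH * dpi)
    return w, h, w * h * 4  # bytes at 32-bit RGBA

print(raster_size(100, 50, 96))   # typical hard-coded system DPI
print(raster_size(100, 50, 220))  # roughly a Retina display's DPI
```

Going from 96 to 220 DPI multiplies the pixel count (and the memory) by about (220/96)^2, roughly a factor of 5, which is the quadratic resource growth at play.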

HTH!

>>
>> I guess I’m curious as to what is relying on the screen resolution
>> and PPI.
>>
>> Although… it’s funny that we have the function
>> SalGraphics::GetResolution, but that returns the PPI!
>>
>> Chris

--
ALG (PGP Key: EE1C 4B3F E751 D8BC C485 DEC1 3C59 F953 D81C F4A2)

SOS

Re: DPI and screen resolution on OS X

Hallo Armin,
New comments inline on your remarks.
On 3/02/2016 14:10, Armin Le Grand wrote:

> Hi,
>
> comments inline
>
> Am 03.02.2016 um 12:35 schrieb SOS:
>>
>> On 3/02/2016 11:32, Chris Sherlock wrote:
>>> On 3 Feb 2016, at 7:24 PM, SOS <[hidden email]> wrote:
>>>>
>>>> On 3/02/2016 3:55, Kohei Yoshida wrote:
>>>>> On Wed, 2016-02-03 at 10:52 +1100, Chris Sherlock wrote:
>>>>>> The other question is: why would we not want to the actual DPI and
>>>>>> screen resolution?
>>>>> My understanding is that, historically, the OS provided a function to
>>>>> query DPI but what gets returned from such function was not always
>>>>> accurate (or always not accurate depending on who you ask). So, the
>>>>> workaround at the time was to assume that DPI is always 96 (and
>>>>> hard-code that value) regardless of what the OS told you, which
>>>>> worked
>>>>> just fine because the monitors used back in the day had the same
>>>>> screen
>>>>> resolution.
>>>> Mostly DPI is found in the header of a pixelfile (taken by camera).
>>>> Unfortunately it's not the photographer who gets to decide about
>>>> the needed DPI.
>>>> DPI is actually a wrong definition for documents, Dots Per Inch is
>>>> a definition used by output devices. Screens need a PIXEL par DOT
>>>> but for print devices there is no precise correlation between the
>>>> number of dots used by the device and the pixels needed in  the
>>>> image for having a maximum image-view quality.
>>>> The print industry has come to some standards by trial and error.
>>>> - monitor screens need 96 - (220-retina) pixels per inch
>>>> - laser printers need 150 pixels per inch (up tot 2000 + dots)
>>>> - offset printers need 254 -300 pixels per inch (up to 3000 dots)
>>> Definitely true :-) Only in OS X’s case, it doesn’t actually report
>>> back the correct resolution unless you ask for the backing
>>> coordinate system.
>>>
>>> The PPI business is a red herring I think I’ve introduced into this
>>> discussion I’m afraid. We calculate the PPI ourselves (and call it
>>> DPI) based on the reported pixels, and the size of the screen in mm
>>> (which we obviously convert to inches).
>> its a bit the wrong discussion: what we see on screen has no
>> relevance: the user can "zoom" the document until he is happy with
>> the image quality on screen
>> But in the current situation, LO users has no idea how big (size) he
>> can place a image in a document.
>> When the doc is intented for online use (email and Web) then there is
>> a minimum of 96 pixels par inch needed. More is no problem but is in
>> many cases a overkill.
>> Who is editing a "book" or a "magazine" need minimal 254 pixels par
>> inch to has a good image quality after printing.
>> When using less pixels the book pages  are looking fine on screen put
>> shall have a creepy print quality
>> So having a new "DocumentProperty" indicating the needed pixels (for
>> printing)  make it possible to make the "size" calculations before
>> inserting.
>
> It is relevant. If you have a vector graphic and it gets converted to
> bitmap, the DPI from the system is used to define the resulting pixel
> size. Conversion to bitmap happens more often than it might seem.
> Examples:
> - user chooses to do so (context menu, convert to bitmap)
> - some exporters who are not capable using vector graphics
> - PDF, e.g. PDF/1A which is not allowed to use transprencies and
> solves by creating bitmaps where graphics and transparent parts overlap
> - 3D renderer which targets to bitmaps (chart, 3D objects)
> Thus, the system DPI is essential. If on Mac, the bigger DPI will be
> used, it will enlarge all these conversions.
That's the problem with this "system DPI": 96 DPI is OK for screen viewing, but far too little when the document needs to be printed.
We need a replacement for the system DPI: a value which can differ per document.

>
> HTH!
>
>>>
>>> I guess I’m curious as to what is relying on the screen resolution
>>> and PPI.
>>>
>>> Although… it’s funny that we have the function
>>> SalGraphics::GetResolution, but that returns the PPI!
>>>
>>> Chris

Eike Rathke

Re: DPI and screen resolution on OS X

In reply to this post by Chris Sherlock
Hi Chris,

On Wednesday, 2016-02-03 23:27:22 +1100, Chris Sherlock wrote:

> If nobody has any objections, I can add an environment variable that forces it to pick up the actual screen resolution.
>
> Something like OSX_FORCEBACKINGCOORDS

Please prefix with LIBO_..., so LIBO_OSX_FORCEBACKINGCOORDS

  Eike

--
LibreOffice Calc developer. Number formatter stricken i18n transpositionizer.
GPG key "ID" 0x65632D3A - 2265 D7F3 A7B0 95CC 3918  630B 6A6C D5B7 6563 2D3A
Better use 64-bit 0x6A6CD5B765632D3A here is why: https://evil32.com/
Care about Free Software, support the FSFE https://fsfe.org/support/?erack


Armin Le Grand

Re: DPI and screen resolution on OS X

In reply to this post by SOS
Hi SOS,

Am 03.02.2016 um 16:46 schrieb SOS:

>
>> It is relevant. If you have a vector graphic and it gets converted to
>> bitmap, the DPI from the system is used to define the resulting pixel
>> size. Conversion to bitmap happens more often than it might seem.
>> Examples:
>> - user chooses to do so (context menu, convert to bitmap)
>> - some exporters who are not capable using vector graphics
>> - PDF, e.g. PDF/A-1, which is not allowed to use transparencies and
>> solves this by creating bitmaps where graphics and transparent parts overlap
>> - 3D renderer which targets to bitmaps (chart, 3D objects)
>> Thus, the system DPI is essential. If on Mac, the bigger DPI will be
>> used, it will enlarge all these conversions.
> That's the problem with this "system DPI":
> 96 DPI is OK for screen viewing, but far too low when the document
> needs to be printed.
> We need a replacement for the system DPI - a value that can differ
> per document.

+1, the DPI to use should be available at the target device and be an
appropriate value (96 for display, 300 for print, whatever; similarly
for PDF export, and obviously something over 200 for Mac displays).

In the best case, the display DPI should in no way be used to
permanently change model data and write it back to the file. No idea
how best to do that.

Example:
You have painted an ellipse (vector data). The user chooses 'convert to
Bitmap' and saves the file. On Mac you will have a >200 DPI bitmap, on
all other systems a 96 DPI one. If you make the change on a non-Mac and
load the file later on a Mac, it may look visibly 'pixelated' on the
display.

What to do?

Not allow the user to convert to bitmap?
Not an option - the user's will should always be respected.

Always use a very high DPI (the highest currently known - which is
already guessing - so the Mac one)?
Also not an option; it will make files much bigger (the resource needs
of a bitmap grow quadratically).

Just mark the graphic as 'being a bitmap' but save the vector data, and
re-create the bitmap data at reload?
Way too complicated - it would give a small file, but needs re-creation
of the bitmap at load time.

Hmmm...

>> HTH!

--
ALG (PGP Key: EE1C 4B3F E751 D8BC C485 DEC1 3C59 F953 D81C F4A2)

Norbert Thiebaud

Re: DPI and screen resolution on OS X

On Thu, Feb 4, 2016 at 10:10 AM, Armin Le Grand <[hidden email]> wrote:
>
> Example:
> You have painted an ellipse (vector data). User chooses 'convert to Bitmap'
> and saves the file. On Mac you will have a >200 DPI bitmap,

Just as a reminder: the 'DPI' on Mac depends dynamically on the
'screen' on which you are drawing at that point in time.
You could have multiple screens with different 'DPI', and they can even
change by system setting (although not supported by the 'standard'
system settings GUI, it _is_ possible to configure the OS so that a
Retina screen is used with 1 point = 1 pixel, rather than the
1 point = 4 pixels you usually have on Retina) (*)

Ideally, DPI 'awareness' should really not leak outside of vcl... and
even in vcl be limited to places where we are in 'PhysicalCoordinate'.

Norbert

(*) One of the issues is that our bitmap abstraction does not make the
distinction between point and pixel, which makes the page preview
thumbnails in Impress, for example, unable to get as crisp as they
could be on a Retina display.
Chris Sherlock

Re: DPI and screen resolution on OS X

On 5 Feb 2016, at 6:37 AM, Norbert Thiebaud <[hidden email]> wrote:

>
> On Thu, Feb 4, 2016 at 10:10 AM, Armin Le Grand <[hidden email]> wrote:
>>
>> Example:
>> You have painted an ellipse (vector data). User chooses 'convert to Bitmap'
>> and saves the file. On Mac you will have a >200 DPI bitmap,
>
> Just as a reminder. the 'DPI' on mac depend dynamically on the
> 'Screen' on which you are drawing at that point in time.
> you could have multiple screen, with different 'DPI', and they can
> even change by system setting (although not supported by the
> 'standard' system setting gui interface, it _is_ possible to configure
> the OS so that a retina screen is used with 1 point = 1 pixel (rather
> than the 1 point = 4 pixel you usually have on retina) (*)
>
> Ideally DPI 'awareness' you really not leak outside of vcl.. and even
> in vcl be limited to place where we are in 'PhysicalCoordinate'
>
> Norbert
>
> (*) one of the issue is that our bitmap abstraction does not make the
> distinction between point and pixel, which makes the page preview
> thumbnails in impress for example, not able to get as crisp as they
> could be on retina display.

There was a series of patches handling hi-DPI displays in 2014 that Keith did for us and that Kendy pushed:

> I've just pushed a backport of the hi-dpi patches from master to gerrit
> for libreoffice-4-2 integration - as was requested earlier, to fix the
> unfortunate state of LibreOffice on the hi-dpi displays.  It is the
> following 5 patches (order is important):
>
> https://gerrit.libreoffice.org/#/c/8516/
> https://gerrit.libreoffice.org/#/c/8517/
> https://gerrit.libreoffice.org/#/c/8518/
> https://gerrit.libreoffice.org/#/c/8519/
> https://gerrit.libreoffice.org/#/c/8520/
>
> Keith confirmed that they fix the hi-dpi issues he was seeing in
> LibreOffice 4.2.
>
> They are supposed to be safe for normal displays; that is anything
> non-safe should be enclosed in an "if (mnDPIScaleFactor > 1)".  Few
> cases make the computation a bit more general, like:
>
> +    long yOffset = (aRect.GetHeight() - mpImpl->maImage.GetSizePixel().Height()) / 2;
> +
>      if( mpImpl->mnState == SIGNATURESTATE_SIGNATURES_OK )
>      {
> -        ++aRect.Top();
> +        aRect.Top() += yOffset;

I’m wondering if this is the area I should focus on.

I’m not entirely sure how the scaling factor is worked out; we seem to do this in Window::ImplInit and Window::ImplInitResolutionSettings with the following calculation:

mnDPIScaleFactor = std::max(1, (mpWindowImpl->mpFrameData->mnDPIY + 48) / 96);

Does anyone know what the underlying theory is behind this calculation? 96 seems to be a hardcoded DPI value assumed for all screens, but I can’t quite work out where the 48 number comes from…

I’m also wondering if it might not be better for us to move this calculation out of Window and into SalGraphics, given it is the SalGraphics backend that really gives the DPI via GetResolution.

Another thing is: we seem to have this idea of logical coordinates, as opposed to device coordinates all through OutputDevice, and also there is a way of setting the OutputDevice mapmode. I’ve never quite understood what the idea behind this is. Can anyone give me any insights into this?

Chris

P.S. I’ve just checked my Mac and the default scaling option is indeed lower than what I was expecting - the default on my Mac with a Retina screen is 2560x1440.

Hint for OS X developers: to change the actual screen resolution, you need to:

1. Go to System Preferences
2. Go into the Display panel
3. In the Display tab, hold down the option key on your keyboard and click on the Scaled radio option

This will give you the chance to set the actual screen resolution, as opposed to the more limited graphical option that OS X gives you.



Tomaž Vajngerl

Re: DPI and screen resolution on OS X

Hi,

On Fri, Feb 5, 2016 at 2:02 AM, Chris Sherlock
<[hidden email]> wrote:
> There were a series of patches that handled hi-DPI displays in 2014 that Keith did for us and that were pushed by Kendy:

Yes, and I continued later on with the patches when I got a laptop
with HiDPI screen and running Fedora.

>> I've just pushed a backport of the hi-dpi patches from master to gerrit
>> for libreoffice-4-2 integration - as was requested earlier, to fix the
>> unfortunate state of LibreOffice on the hi-dpi displays.  It is the
>> following 5 patches (order is important):
>>
>> https://gerrit.libreoffice.org/#/c/8516/
>> https://gerrit.libreoffice.org/#/c/8517/
>> https://gerrit.libreoffice.org/#/c/8518/
>> https://gerrit.libreoffice.org/#/c/8519/
>> https://gerrit.libreoffice.org/#/c/8520/
>>
>> Keith confirmed that they fix the hi-dpi issues he was seeing in
>> LibreOffice 4.2.
>>
>> They are supposed to be safe for normal displays; that is anything
>> non-safe should be enclosed in an "if (mnDPIScaleFactor > 1)".  Few
>> cases make the computation a bit more general, like:
>>
>> +    long yOffset = (aRect.GetHeight() - mpImpl->maImage.GetSizePixel().Height()) / 2;
>> +
>>      if( mpImpl->mnState == SIGNATURESTATE_SIGNATURES_OK )
>>      {
>> -        ++aRect.Top();
>> +        aRect.Top() += yOffset;
>
> I’m wondering if this is the area I should focus on.
>
> I’m not entirely sure how the scaling factor is being worked out, we seem to do this in Window::ImplInit and Window::ImplInitResolutionSettings with the following calculation:
>
> mnDPIScaleFactor = std::max(1, (mpWindowImpl->mpFrameData->mnDPIY + 48) / 96);
>
> Does anyone know what the underlying theory is behind this calculation? 96 seems to be a hardcoded DPI value assumed for all screens, but I can’t quite work out where the 48 number comes from…

You need the scale factor for bitmaps (icons) and some other places in
the UI (mostly the places where we draw 1-pixel lines), because when
you increase the DPI everything becomes larger pixel-wise, but bitmaps
stay as they are. When you start to approach 192 DPI (2*96) we increase
the scaling factor and scale the bitmaps by it. The 48 is there only so
that we scale before we hit 192 DPI - at around 144 DPI (however, in
the latest code this starts at 169 DPI).

OS X is however excluded from this - it does its scaling in the backend
AFAIK (see Window::CountDPIScaleFactor).

> I’m also wondering if it might not be better for us to move this calculation out of Window and into SalGraphics, given it is the SalGraphics backend that really gives the DPI via GetResolution.

Yes, it would be better to do it in the backend I guess.

> Another thing is: we seem to have this idea of logical coordinates, as opposed to device coordinates all through OutputDevice, and also there is a way of setting the OutputDevice mapmode. I’ve never quite understood what the idea behind this is. Can anyone give me any insights into this?

The OutputDevice backends work only with pixels - you can set the map
mode to a logical mode and then all the inputs can be in those logical
coordinates; OutputDevice will automatically convert them to pixels. I
don't like this, however. I think it doesn't belong on OutputDevice and
it just adds bloat - if we need something like this, then it should be
a wrapper around OutputDevice that does it (LogicalOutputDevice?).

> Chris
>
> P.S. I’ve just checked my Mac and the default scaling option is indeed lower than what I was expecting - the default on my Mac with a Retina screen is 2560x1440.
>
> Hint for OS X developers: to change actual screen resolutions, you need to:
>
> 1. Go to System Preferences
> 2. Go into the Display panel
> 3. In the Display tab, hold down the option key on your keyboard and click on the Scaled radio option
>
> This will give you the chance to set the actual screen resolution, as opposed to the more limited graphical option that OS X gives you.

Regards, Tomaž
Chris Sherlock

Re: DPI and screen resolution on OS X


> On 5 Feb 2016, at 9:29 PM, Tomaž Vajngerl <[hidden email]> wrote:
>
> Hi,
>
> On Fri, Feb 5, 2016 at 2:02 AM, Chris Sherlock
> <[hidden email]> wrote:
>> There were a series of patches that handled hi-DPI displays in 2014 that Keith did for us and that were pushed by Kendy:
>
> Yes, and I continued later on with the patches when I got a laptop
> with HiDPI screen and running Fedora.

Thank you :-)

>>> I've just pushed a backport of the hi-dpi patches from master to gerrit
>>> for libreoffice-4-2 integration - as was requested earlier, to fix the
>>> unfortunate state of LibreOffice on the hi-dpi displays.  It is the
>>> following 5 patches (order is important):
>>>
>>> https://gerrit.libreoffice.org/#/c/8516/
>>> https://gerrit.libreoffice.org/#/c/8517/
>>> https://gerrit.libreoffice.org/#/c/8518/
>>> https://gerrit.libreoffice.org/#/c/8519/
>>> https://gerrit.libreoffice.org/#/c/8520/
>>>
>>> Keith confirmed that they fix the hi-dpi issues he was seeing in
>>> LibreOffice 4.2.
>>>
>>> They are supposed to be safe for normal displays; that is anything
>>> non-safe should be enclosed in an "if (mnDPIScaleFactor > 1)".  Few
>>> cases make the computation a bit more general, like:
>>>
>>> +    long yOffset = (aRect.GetHeight() - mpImpl->maImage.GetSizePixel().Height()) / 2;
>>> +
>>>     if( mpImpl->mnState == SIGNATURESTATE_SIGNATURES_OK )
>>>     {
>>> -        ++aRect.Top();
>>> +        aRect.Top() += yOffset;
>>
>> I’m wondering if this is the area I should focus on.
>>
>> I’m not entirely sure how the scaling factor is being worked out, we seem to do this in Window::ImplInit and Window::ImplInitResolutionSettings with the following calculation:
>>
>> mnDPIScaleFactor = std::max(1, (mpWindowImpl->mpFrameData->mnDPIY + 48) / 96);
>>
>> Does anyone know what the underlying theory is behind this calculation? 96 seems to be a hardcoded DPI value assumed for all screens, but I can’t quite work out where the 48 number comes from…
>
> You need the scale factor for bitmaps (icons) and some other places in
> the UI (mostly the places where we draw 1 pixel lines) because when
> you increase the DPI everything becomes larger pixel wise but bitmaps
> stay as they are. When you start to approach 192 DPI (2*96) we
> increase the scaling factor and scale the bitmaps by the scaling
> factor. 48 is there only so that we scale before we hit 192 DPI - at
> around 144 DPI (however in the latest code this starts at 169 DPI).

Sorry for asking silly questions, but I’m not entirely following. Why do we increase the scaling factor at 2*96, but then start scaling before we hit 192?

Just a little confused by this, I’m sure it’s something obvious I’m missing! :-)

> OSX is however excluded from this - it does its scaling in the backend
> AFAIK. (see WIndow::CountDPIScaleFactor)
>
>> I’m also wondering if it might not be better for us to move this calculation out of Window and into SalGraphics, given it is the SalGraphics backend that really gives the DPI via GetResolution.
>
> Yes, it would be better to do it in the backend I guess.

Cheers, I’ll put this on my list of things to do.

>
>> Another thing is: we seem to have this idea of logical coordinates, as opposed to device coordinates all through OutputDevice, and also there is a way of setting the OutputDevice mapmode. I’ve never quite understood what the idea behind this is. Can anyone give me any insights into this?
>
> OutputDevice backends work only with pixels - you can set the mapmode
> to a logical mode and all the inputs can be in that logical
> coordinates. OutputDevice will automatically convert them to pixels. I
> don't like this however. I think this doesn't belong on the
> OutputDevice and it just adds bloat - if we need something like this
> then as a wrapper around OutputDevice that does this
> (LogicalOutputDevice?).

So it’s better to create a decorator class - I’ve actually wondered whether that might be a better solution. Good info!

Thanks Tomaz, appreciate the response!

Chris