Change display resolution in run-time


JohnLKW


Hi

I am working on an HDMI driver with µGFX. Because the resolution of a monitor is not fixed, I have to develop a driver that reads the EDID information from the monitor and passes the horizontal and vertical resolution, together with the porch timings, to µGFX. The EDID parser part is working nicely, but I have a problem with the µGFX LLD driver. This should be possible inside LLDSPEC bool_t gdisp_lld_init(GDisplay *g) in the gdisp_lld_xxx.c file, but even though I can declare GDISP_SCREEN_HEIGHT and GDISP_SCREEN_WIDTH as variables, there doesn't seem to be any function exposed to main for setting them at run-time. The only function I can call is gfxInit(), which takes no parameters.

Any idea is welcome.

 

John


Hello @JohnLKW and welcome to the µGFX community!

GDISP_SCREEN_WIDTH and GDISP_SCREEN_HEIGHT are macros (C-defines). They get replaced by the pre-processor prior to compilation. Therefore, you can't change their values during runtime.
Have a look at the pixmap driver; it allows setting the resolution at runtime (when calling gdispPixmapCreate()). I guess that should help you.
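A minimal sketch of that approach (µGFX 2.x API; this assumes GDISP_NEED_PIXMAP is enabled in your gfxconf.h):

    #include "gfx.h"

    // Create an off-screen surface whose size is only known at run-time
    // and draw on it like any other display.
    static GDisplay* makeSurface(coord_t w, coord_t h) {
        GDisplay *pix = gdispPixmapCreate(w, h);
        gdispGClear(pix, Black);
        gdispGDrawLine(pix, 0, 0, w - 1, h - 1, White);
        return pix;
    }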

Don't hesitate to ask if you have any further questions. We're happy to help wherever we can.


Hi Joel

I've given my problem a second thought and found that my system may not strictly require a resolution change at run-time. The scenario is like a PC connecting to monitors of different resolutions: monitor A has a preferred resolution of 1920x1080, while monitor B may have 1280x1024. Fortunately, my target system allows a system reboot when the monitor is changed. That means it is only necessary to substitute the right resolution inside gdisp_lld_init(GDisplay *g) during the startup sequence, so it is not strictly a run-time change; it is just a startup parameter.

Thanks for the gdispPixmapCreate() suggestion; it will definitely be useful for my work.

Best Regards

John

 


4 hours ago, Joel Bodenmann said:

GDISP_SCREEN_WIDTH and GDISP_SCREEN_HEIGHT are macros (C-defines). They get replaced by the pre-processor prior to compilation. Therefore, you can't change their values during runtime.

It depends on the underlying code, but it may be possible to define these macros as functions rather than constants (this has worked for me in other code). They can then return appropriate values on demand.
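For example, one way to read that suggestion (the edid_get_* helpers are hypothetical wrappers around an EDID parser):

    // Hypothetical helpers wrapping the EDID parser:
    coord_t edid_get_width(void);
    coord_t edid_get_height(void);

    // The macros then expand to function calls instead of constants:
    #define GDISP_SCREEN_WIDTH   edid_get_width()
    #define GDISP_SCREEN_HEIGHT  edid_get_height()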


@JohnLKW: "Run-time" here refers to the run time of the program, basically the time during which the code is being executed. There's compile-time and there's run-time. Although gdisp_lld_init() doesn't get called directly by your own program, it still happens during the run-time (execution time) of the program.

@steved: While that would/might work (those two macros are only used internally by drivers), it would still be hacky/ugly. They are part of the public interface that allows the user to set the resolution of drivers which already support that (e.g. the Win32, X and SDL drivers). The driver can simply ignore them, but the user must be able to #define them as integer values prior to compilation; otherwise some code will not build properly. Again: it would work, but it would break in some cases. They were not meant to be used like that, so the proper way is not to use them like that :D
All other parts of the µGFX code rely on gdispGetWidth() and gdispGetHeight(), which simply access the Width and Height fields in the GDisplay struct. That's all gdispPixmapCreate() does in order to support setting a resolution during run-time. No other magic involved.
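(Paraphrasing gdisp.c: the per-display getters just return what the driver stored during gdisp_lld_init().)

    coord_t gdispGGetWidth(GDisplay *g)  { return g->g.Width;  }
    coord_t gdispGGetHeight(GDisplay *g) { return g->g.Height; }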


The two GDISP_SCREEN_XXX macros are pretty much obsolete. They may be used inside a driver to initialise the display size (e.g. Win32) but that is not required.

What is important is the setting of the g->g.Width and g->g.Height members in the gdisp_lld_init routine.

Once initialised, there is no API support for changing the resolution thereafter, although something could be coded using a special GDISP_CONTROL code. You would however need to be very careful with framebuffer reinitialisation and things like that.
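A rough sketch of that idea (GDISP_CONTROL_SETRES and the MyRes struct are made-up names; a real implementation would also have to reinitialise the framebuffer):

    // Driver-specific control codes start at GDISP_CONTROL_LLD.
    #define GDISP_CONTROL_SETRES    (GDISP_CONTROL_LLD + 0)

    typedef struct MyRes { coord_t width, height; } MyRes;

    LLDSPEC void gdisp_lld_control(GDisplay *g) {
        switch (g->p.x) {                    // g->p.x carries the control code
        case GDISP_CONTROL_SETRES: {
            const MyRes *r = (const MyRes *)g->p.ptr;
            // Reprogram the display timing hardware here, then update the
            // GDISP structure so gdispGetWidth()/gdispGetHeight() agree.
            g->g.Width  = r->width;
            g->g.Height = r->height;
            break;
        }
        default:
            break;
        }
    }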


@Joel Bodenmann: "That's all that gdispPixmapCreate() is doing in order to support setting a resolution during run-time."

@John: Sorry, I still cannot get this point. I do understand virtual displays, and gdispPixmapCreate() can create a surface of arbitrary size in memory, but this surface needs to be copied to the physical display in the end. The problem is that we have physical displays of different sizes, ranging from 640x480 to 1920x1080.

@inmarket: "What is important is the setting of the g->g.Width and g->g.Height members in the gdisp_lld_init routine."

@John: That is what I am trying to set. The code snippet below is working: when a monitor of 1920x1080 is connected, gdispGetWidth() and gdispGetHeight() return 1920 and 1080, respectively.

//...

#ifdef GFX_USE_HDMI
struct MonitorInfo *monitor;    /* describes an EDID block fetched from the monitor */
#endif

//...

LLDSPEC bool_t gdisp_lld_init(GDisplay *g) {
    // Initialise the board interface incl. hardware reset etc.
    hal_begin();

    /* Initialise the GDISP structure */
#ifdef GFX_USE_HDMI
    /* GDISP_SCREEN_WIDTH/HEIGHT are declared as variables in this build,
       so they can be assigned from the parsed EDID detailed timings. */
    g->g.Width  = GDISP_SCREEN_WIDTH  = monitor->detailed_timings[0].h_addr;
    g->g.Height = GDISP_SCREEN_HEIGHT = monitor->detailed_timings[0].v_addr;
#else
    g->g.Width  = GDISP_SCREEN_WIDTH;
    g->g.Height = GDISP_SCREEN_HEIGHT;
#endif

    g->g.Orientation = GDISP_ROTATE_0;
    g->g.Powermode   = powerOn;
    g->g.Backlight   = GDISP_INITIAL_BACKLIGHT;
    g->g.Contrast    = GDISP_INITIAL_CONTRAST;

    return TRUE;
}

 

In main I wrote:

while (1) {

    //... some more init code

    gfxInit();

    // Get the screen size
    coord_t width  = gdispGetWidth();
    coord_t height = gdispGetHeight();
    Serial.println(width, DEC);      // prints 1920
    Serial.println(height, DEC);     // prints 1080
}

 

This code is working! But there is another problem: gfxInit() takes an incredibly long time to finish (around 10 seconds).

I have to wait that long before seeing 1920 and 1080 on the serial monitor.

My question is: what functions are involved in gfxInit() besides getting LLDSPEC bool_t gdisp_lld_init(GDisplay *g) called? I have traced it down to the line gdriverRegister(&GDISPVMT_OnlyOne->d, 0) in gdisp.c, but I can't understand what it does.

 

John


gfxInit calls _gdispInit, which calls gdriverRegister, which calls _gdispdriverinit (I am uncertain of this function's name as I am currently not at my computer), which then calls gdisp_lld_init.

gfxInit initialises the entire uGFX system.

_gdispInit initialises the GDISP module. For each display driver it calls gdriverRegister. 

gdriverRegister allocates the driver structure and adds it to the driver chain.

_gdispdriverinit is then called to initialise all the common gdisp components within the driver structure. It in turn calls gdisp_lld_init

gdisp_lld_init in the driver initialises the driver and the hardware.
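In other words, the chain looks roughly like this:

    gfxInit()
     └─ _gdispInit()                  // GDISP module init
         └─ gdriverRegister()         // allocate driver struct, add to chain
             └─ _gdispdriverinit()    // common GDISP init (name as noted above)
                 └─ gdisp_lld_init()  // driver + hardware init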

There is also a post-init sequence of calls to enable the driver to change to non-init bus speeds. Most drivers don't use this, however.

You will need to find the source of your delay by debugging. It is likely in your driver, as initialisation is very quick except where drivers introduce delays waiting for hardware.

The only other source of delay in gfxInit is the startup logo. That starts as soon as each display adaptor initialises and ends a few seconds after all displays are initialised. This delay is obviously intentional and can be turned off.
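(If memory serves, the logo is controlled by a gfxconf.h option along these lines; verify the exact name against your gfxconf.example.h:)

    #define GDISP_STARTUP_LOGO_TIMEOUT   0   // 0 disables the startup logo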


On 4/22/2017 at 06:21, inmarket said:

The only other source of delay in gfxInit is the startup logo. That starts as soon as each display adaptor initialises and ends a few seconds after all displays are initialised. This delay is obviously intentional and can be turned off.

Besides the startup logo, could the hidden screen clear to black be causing the delay? No startup logo has ever been enabled here. I have found that when the screen resolution is set to 320x240 the delay is very short, but when the resolution is changed to 1280x720 the delay spans as long as 10 seconds. At that point I had not yet implemented a hardware screen clear.

Now I have implemented the screen clear with 2D acceleration and the startup delay has been solved. So I believe the delay was caused by a pixel-by-pixel stream write hidden somewhere in the init function, but I am not sure, since I haven't found the relevant screen-flush code yet.

On 4/22/2017 at 06:21, inmarket said:

The only other source of delay in gfxInit is the startup logo.

This is exactly what I was asking about in this post. On top of that, string drawing is also a problem now.

 


The first thing that happens with any display is that the screen is cleared, regardless of whether the startup logo is displayed. That is why implementing the hardware-accelerated clear helped.

Any time you are driving a screen as large as 1920x1080 you are talking about a LOT of pixels. Acceleration functions are absolutely mandatory, especially on slow CPUs.

Make sure you implement all the ugfx acceleration functions, particularly fillarea and blit.
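For reference, the driver-side hooks look roughly like this (µGFX 2.x driver interface; the hal_* calls are placeholders for whatever your board/2D engine provides):

    // In gdisp_lld_config.h: advertise the hardware capabilities.
    #define GDISP_HARDWARE_FILLS    TRUE
    #define GDISP_HARDWARE_BITFILLS TRUE

    // In gdisp_lld_xxx.c: the parameters arrive in g->p.
    LLDSPEC void gdisp_lld_fill_area(GDisplay *g) {
        // Rectangle in g->p.x/y/cx/cy, colour in g->p.color.
        hal_fill_rect(g->p.x, g->p.y, g->p.cx, g->p.cy,
                      gdispColor2Native(g->p.color));
    }

    LLDSPEC void gdisp_lld_blit_area(GDisplay *g) {
        // Pixel data in g->p.ptr; g->p.x1/y1 give the source offset and
        // g->p.x2 the source line width within that buffer.
        hal_blit(g->p.x, g->p.y, g->p.cx, g->p.cy,
                 g->p.x1, g->p.y1, g->p.x2, (const pixel_t *)g->p.ptr);
    }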

Text drawing is a complex operation and is not the fastest. On one hand, µGFX uses bitmap fonts, which are much faster than rendering scaled fonts like TrueType. On the other hand, µGFX does not cache rendered fonts because it assumes it doesn't have the RAM to do so (and on most embedded systems it won't). That makes it slower than cached TrueType.

The only thing that can be done currently is to implement the driver acceleration functions. It is currently too big a job to add another font engine (like FreeType) when that font engine would be of no use to 99% of the µGFX user base because of its code and RAM requirements.


9 hours ago, inmarket said:

The only thing that can be done currently is to implement the driver acceleration functions. It is currently too big a job to add another font engine (like FreeType) when that font engine would be of no use to 99% of the µGFX user base because of its code and RAM requirements.

I think the deal here would be to use the hardware font rendering offered by the RA887x chip. You can give it font files, which it stores in its own flash, and then you simply tell it where to render which font in which color. So it's the same deal as with the other hardware accelerations for shapes: it's a matter of bypassing the µGFX font rendering and sending the proper commands to the RA887x chip to let it do the job. However, this one might be a bit more complex, as you have to load the font into the chip's font memory at some point. Using the GDISP control interface might make sense here.

On 5/1/2017 at 17:22, Joel Bodenmann said:

I think the deal here would be to use the hardware font rendering offered by the RA887x chip. You can give it font files, which it stores in its own flash, and then you simply tell it where to render which font in which color. So it's the same deal as with the other hardware accelerations for shapes: it's a matter of bypassing the µGFX font rendering and sending the proper commands to the RA887x chip to let it do the job. However, this one might be a bit more complex, as you have to load the font into the chip's font memory at some point. Using the GDISP control interface might make sense here.

In this case, do you mean that I have to call my own string-draw function like hal_drawString(args) instead of gdispDrawString(args)? I cannot find any directive to bypass the µGFX font rendering, like a GDISP_HARDWARE_xxx option.

Where can I find more documentation on the GDISP control interface?

 


Unfortunately it is a lot more complex than that. In order to be useful, the gdispDrawString() (and similar) calls still need to have their effect, as higher-level items such as the widgets and user applications depend on them. This means that the font rendering needs to be intercepted below that level: basically, you'd have to intercept between the GDISP layer itself and the font rendering engine. However, this gets complex because you no longer have control over the fonts (unless you can load them dynamically into the font-engine chip through the gdispOpenFont() function), and the main problem is that you have to reproduce the exact behavior in terms of color filling, justification and so on.


Pursuing this would currently be very difficult. Replacing the font engine (even with a hardware-based engine) would require major changes to lots of µGFX code, and in the most complex area of µGFX.

If you really need to do it let me suggest that this not be attempted in V2.x. Wait until V3.0 is released and then we can start this conversation again as it would need to be reimplemented for V3.0 anyway.

