You can make a value depend on the render resolution through drivers. Right-click the field you want to depend on the resolution (blur size, for example) and choose Add Driver > Manually Create Later. Then, in a Curve Editor window, change the window subtype to Drivers, find the driver in the left-hand tree view and click it. In the right-hand properties panel (N shortcut key), open the Drivers tab and set it up as follows:
- Type: Scripted Expression
- Expr: the expression you want, for example res_x*0.05 (res_x is the variable name here, see below)
- under the Variables list, change the variable type to Single Property and give it a meaningful name, res_x for example
- in the variable's Prop field, change the type to Scene and select your active scene in the combo box
- copy the resolution data path (X or Y dimension, whichever you need) by right-clicking the render resolution value in the Render panel and choosing Copy Data Path
- paste the copied data path into the Path field of the variable panel
- if everything is done correctly, your render resolution now drives your value (the same setup done from Python is sketched below)
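If you prefer to set the driver up from a script instead of clicking through the UI, here is a minimal Python sketch of the same thing. It assumes the compositor node tree is in use and contains a Blur node named "Blur"; the node name, the driven property (size_x) and the 0.05 multiplier are placeholders for whatever you actually want to drive.

```python
import bpy

scene = bpy.context.scene
node = scene.node_tree.nodes["Blur"]       # hypothetical Blur node in the compositor

# Equivalent of right-click > Add Driver on the node's X size
fcurve = node.driver_add("size_x")
driver = fcurve.driver
driver.type = 'SCRIPTED'
driver.expression = "res_x * 0.05"         # same example expression as above

# Single Property variable pointing at the scene's render width
var = driver.variables.new()
var.name = "res_x"
var.type = 'SINGLE_PROP'
target = var.targets[0]
target.id_type = 'SCENE'
target.id = scene
target.data_path = "render.resolution_x"   # what Copy Data Path gives you
```

After running this, changing Resolution X in the Render panel updates the blur size automatically, just like the manual setup above.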
For drivers to work, you must enable automatic running of Python scripts in the user preferences. Open User Preferences and, in the File tab, check the Auto Run Python Scripts checkbox.
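The same preference can be toggled from the Python console; a sketch assuming a 2.7x-era Blender (the property path moved in later versions, so check the API docs for your release):

```python
import bpy

# Allow .blend files to auto-run scripts, which drivers need (2.7x-era path;
# newer versions expose it under bpy.context.preferences.filepaths instead)
bpy.context.user_preferences.system.use_scripts_auto_execute = True
```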
This is logic I personally strongly dislike. Fusion uses this approach and calls it resolution independence, but in practice it is a total brainf*ck. It is much more flexible to give precise values in pixel dimensions and expose other variables through expressions, drivers or whatever else. If I need a value to depend on the resolution, I can hook it up in an expression, but most of the time I don't need that, and it is very counterintuitive to use something like 0.003% in a transform or blur instead of 10.5 px. A pixel value is an exact, quantifiable unit; a resolution-dependent percentage is not, just as saying that an image is 300 dpi says nothing about its actual size or quality.