3D Blender character model & Unity issue

I’m asking this on behalf of my animator in the hope of finding a solution, so I’ll do my best to describe things accurately.
I’m doing a project in Unity, and he built a 3D humanoid character in Blender for it. He’s been making animations for the character in Blender, and they’ve been great. Recently, though, I implemented a script-based approach that makes the character’s eyes and head track targets (like the player) in 3D space in Unity.

At first the eye and head tracking didn’t work at all whenever an Animator Controller was attached to the character in Unity (with no Animator Controller, the tracking worked fine). My animator sent me a new model with clean data channels for the eyes and head, which let the script control the eyes the way I wanted while an Animator Controller simultaneously drove torso animations. The head, however, still isn’t tracking like it should, and I’m worried I’ve lost the ability to have facial animations with this almost-working setup.

I should also mention that I’ve tried various setups with animation layers and override/additive avatar masks, with no real improvement.
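For context, the tracking script is conceptually along these lines (a minimal sketch with illustrative names, not my actual code): it writes the head bone’s rotation in LateUpdate, since LateUpdate runs after the Animator has evaluated its pose for the frame, so the script’s rotation is what ends up on screen.

```csharp
using UnityEngine;

// Hypothetical sketch of script-driven head tracking.
// LateUpdate runs after the Animator writes its pose, so setting the
// bone rotation here overrides whatever the animation clip posed this frame.
public class HeadTracker : MonoBehaviour
{
    public Transform headBone;   // assigned to the rig's head bone
    public Transform target;     // e.g. the player
    [Range(0f, 1f)] public float weight = 1f;  // blend between animated pose and look-at

    void LateUpdate()
    {
        if (headBone == null || target == null) return;

        // Rotation that would aim the head directly at the target.
        Quaternion look = Quaternion.LookRotation(
            target.position - headBone.position, Vector3.up);

        // Blend from the Animator's pose toward the look-at rotation.
        headBone.rotation = Quaternion.Slerp(headBone.rotation, look, weight);
    }
}
```

The timing matters: if the same code runs in Update instead, the Animator evaluates afterward and stomps the script’s rotation, which matches the “tracking stops working once an Animator Controller is attached” symptom.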
So my question: is there something my animator (or I) should be doing in Blender or Unity so that the eyes and head can track targets via script, while an animator controller still drives both body and facial animations?
I’m open to overhaul suggestions as well, if there’s a solution that’s different from the setup we’re currently trying.
Thanks in advance!