Well, I use generated/object coordinates sometimes, when I don’t feel like mixing over seams. (But those can be represented by a UV map anyway: just do a pair of orthogonal project-from-view mappings, which can give much more intuitive, artistic control than trying to perturb natural coords with math alone. UV maps don’t have to have any seams.) A box mapping won’t give you smooth procedurals either, any more than it will give smooth image textures across Suzanne. There are discontinuities (unless you’re texturing a box, in which case UV also works fine).
I’ve done a six-plane mix-by-normals before. That’s how I know you can’t use it for displacement output; learned that the hard way.
I think box-mapping via node group can be done-- maybe not identically to Blender’s algorithm, but quite possibly better. I’m going to keep experimenting. I’m not sure that Blender’s box mapping blends, but if it does, it does so only faintly, and you can do that yourself by mixing the output of multiple texture lookups, just like your three-way brick looks to be doing.
Edit: Oops! I was wearing blinders. I didn’t realize the “blend” was a slider. It does look just like mixing from multiple samples.
Box mapping appears to ignore custom normals, whether from modifiers or from displacement output, so I think it is using position in texture space, not the normal.
Edit2: Box mapping with blending done in nodes, by mixing between multiple texture lookups, compared with box blending via the image texture node on the right. Not exactly the same: the built-in box map in the image texture node seems to be evaluated per-vertex and then interpolated, not per-sample, but per-sample is better anyway. Can’t say I have the “curve” of the blend the same-- I almost certainly do not, I just used a power node to scale the weights of the texture lookups. The whole node setup is pretty messy, sorry, I’ve been experimenting a lot.
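For anyone who wants to try the same thing: the node setup above boils down to standard triplanar math, which is easy to sketch outside of nodes. This is just a hedged Python illustration of the weight calculation, not Blender’s actual box-mapping code; `blend_power` stands in for the power node I used to sharpen the blend, and the three “samples” stand in for the three texture lookups.

```python
def triplanar_weights(normal, blend_power=8.0):
    """Blend weights for mixing three axis-aligned texture projections.

    normal: (nx, ny, nz) surface normal (need not be unit length).
    blend_power: stand-in for the power node exponent; higher values
    give a sharper transition between the projections.
    """
    # The absolute normal components decide how strongly each axis
    # projection shows through at this shading point.
    w = [abs(c) ** blend_power for c in normal]
    total = sum(w)
    # Normalize so the three texture samples always sum to full strength.
    return [c / total for c in w]

def triplanar_mix(sample_x, sample_y, sample_z, normal, blend_power=8.0):
    """Mix three scalar texture samples by normal-based weights."""
    wx, wy, wz = triplanar_weights(normal, blend_power)
    return sample_x * wx + sample_y * wy + sample_z * wz
```

With a high `blend_power` this snaps toward a hard six-plane split (no blending); with a power near 1 the projections bleed into each other over a wide region, which is roughly what the blend slider seems to control.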