Linear regression with errors in both variables is a common modeling problem with a century-long literature, yet a complete and correct solution has still not come into widespread use. Much of this is due to confusion between joint and conditional modeling and an unhealthy aversion to priors. This paper expands on the proper Bayesian method of Zellner (1971) and Gull (1989), deriving specific parameter estimators and analyzing their performance. They are shown to compare favorably with total least squares.
(This is an "executive summary", which covers only the univariate case and omits some equations and derivations. The longer version is in limbo until I find the time and interest to finish it.)