Daniel B. Wallace on the Textual Reliability of the New Testament

Dan Wallace refutes the notion that we can’t know what the original biblical texts say (as suggested by Dan Brown’s The Da Vinci Code).  The context of his comments is that he is rebutting the views of Bart D. Ehrman.  Wallace’s comments on this subject are broken into the two video clips you see below.

Beginning at 00:50 in the first clip, Dan Wallace says:

“When the King James translators did their work, they based their New Testament on essentially six New Testament manuscripts, the earliest of which came about 500 years before the King James was produced.  Now we have, almost 400 years later, we have about one thousand times as many manuscripts as they had.  And our earliest don’t come 500 years earlier, our earliest manuscripts go all the way down to the 2nd Century.  It’s a huge difference.  As time goes on, we’re actually getting closer to the original both in numbers of manuscripts and in the dates of those manuscripts.”

In the second clip, Wallace points out how much more sure we can be about what the New Testament says than we can be about any other ancient text.  “Conjectural emendation” (which must be applied when there is no textual evidence) is commonplace in textual criticism of ancient literature other than the New Testament; for the New Testament it is not necessary because there is textual evidence for every word.  Thus, if we say that we can’t be sure about what the New Testament says, then, in order to be consistent, we would have to say that we are even less sure of all the other writings of antiquity.  That means Plato, Aristotle, Herodotus, Thucydides, Cicero, Tacitus, Julius Caesar, and all the rest!  (Though he only implies as much in these videos, Wallace has certainly been this explicit about it elsewhere.  There are many videos of his presentations on the internet.)
