The long answer would be very long, but the short one is short enough: Your question 'begs the question'; it is simply not true that 'dependency gets the job done more easily and economically'. What's more, it does not get the job done at all (if by the job we mean the job the best constituency-based grammars already do, and with great precision, as a matter of course).
A (syntactic) 'dependency' is just a syntagmatic relation between two minimal signs (under DG assumptions), and to that extent DG's basic 'tool' is extremely flexible (so are part-whole relations, though), which is why it has usually attracted the attention of computational linguists rather than 'pure' linguists. But most syntagmatic relations are irrelevant; why they are irrelevant must be explained; irrelevant dependencies should not be represented in DG diagrams if these were really minimal and as efficient as you claim (but they are represented, and the diagrams are not minimal at all); and those dependencies considered relevant to DG must still be properly defined, because, as far as I know, even 'subject of', 'object of', 'adjunct of' (and, correspondingly, 'predicate of', 'argument of', 'Agent of', 'Patient of', etc.) have not yet been properly defined in DG.
To illustrate with just one example: in My wife has told our eldest daughter to clean the kitchen after today's party (a slightly more elaborate sentence than the one cited in the Wikipedia article on 'Catena'), wife is not the 'subject', nor daughter the 'object', nor to the 'complement' of has (or of told, for that matter). Needless to say, neither is kitchen the 'object' of clean, nor party the 'complement' of after, nor after the 'adjunct' of clean, etc., etc.
If DG wants to call such dependencies 'subject', 'object', etc., it must develop a completely new theory of syntactic functions, and that, of course, will come at a heavy price. I suspect that even if that were possible at all, which I doubt, the resulting theory would certainly not be simpler, nor more efficient, than the extant constituency-based one; quite the contrary.
Correspondingly, at the semantic level, wife, daughter, to, kitchen, party, etc. are not 'arguments' of their respective '(maximally unsaturated) predicates' has/told, clean and after, either, and calling them 'arguments', and assigning 'semantic roles' to them will require massive adjustments in semantic theory and its associated ontology. Just think of the fact that arguments (in the sense we are using the term now) must be referential, i.e., their names must denote (first-order) entities, whereas, of course, wife, daughter, kitchen, party, etc. do not denote individuals at all.
As far as I can see, that, by itself, is an insurmountable difficulty, but there is more: then, DG will need 'correspondence rules' to 'link' syntactic to semantic terms, and, to my knowledge, those have not been developed either.
Note that such objections emerge even at the most elementary level of metatheoretic analysis, in the simplest grammatical sentences, but there is rather more than that to account for in a natural language, isn't there? There is, at least in many languages, an extremely subtle 'word order' (just think of the ordering restrictions among the more than forty different classes of adverbs that Cinque has identified), and, of course, there is 'scope', 'displacement', 'discontinuity', 'island (accessibility) constraints', 'minimality effects', 'superiority effects', 'reconstruction effects', and there is 'binding', and 'control', etc., etc., etc.
DGs seem to 'work' (of course, only as elementary parsing strategies that ignore all the above-mentioned difficulties) because they are typically used to parse only the simplest well-formed sentences, as happens in CL work; i.e., they are 'restricted prototypes', trivial toys at bottom. But a respectable linguistic theory must also explain why ill-formed sentences (or interpretations thereof) are ill-formed, and to do that you must do rather more than draw, or not draw, arcs between signs more or less at will.
You do not account for wh-movement by just drawing an arc between What and buy in What did he say he wanted to buy?, nor explain the ungrammaticality of '*Who did you say that wanted the job?' by not drawing any arc between who and wanted, nor account for the twofold dependency between the subject He and the two verbs has and playing in He has been playing the guitar by drawing a couple of arcs between He and -s and play, nor predict the binding phenomena by drawing another arc between He and himself in He promised Tom to do it himself or omitting it between He and him in He promised him to do it himself, etc.
I could go on like this more or less indefinitely, in virtually any aspect of a proper syntactic-semantic theory, but will not. The answer to your question is very simple: DGs cannot oust state-of-the-art constituency-based grammars because they are neither empirically comparable to them in any respect, nor conceptually or representationally any simpler than they are (once all the auxiliary assumptions needed to reconcile DG analyses with the empirical facts are properly spelled out). So far, they have not been, and with good reason: DG is still resting on theories of syntactic and semantic functions that are incompatible with it.
Of course, going any further is out of the question. Even 'translating' the principles of current CG-based grammars into a framework that denies the existence of phrasal dependents (in spite of the fact that there is unquestionable evidence that natural language is structure-dependent) may well be literally impossible, as partly explained above. As to DG's alternatively developing equivalent principles of its own to properly handle the empirical effects of constituency, allowed and disallowed displacement, correct and incorrect scope, minimality, superiority, island constraints, correct and incorrect binding, control, ellipsis, deletion and empty-category effects, etc., well, I will not say it is 'impossible', but none of that remotely exists yet, and everything suggests that if a proper theory of 'all that' ever comes to be developed from a strict DG point of view, it will be rather more stipulative and complex, and rather less efficient, than state-of-the-art constituency-based grammars.
You are welcome to explore and try to promote any new theory you fancy, but it is pretentious of DG fans to design little toy grammars and ignore the enormous body of knowledge that CGs have managed to offer us after sixty years of colossal intellectual work by thousands of the best linguists the world has ever produced.
In closing, I would like to expand my critical remarks by elaborating on a simple, but powerful (I hope) would-be conceptual argument against Dependency Grammar as programmatically presented in its foundational manifestos. It goes, more or less, as follows:
The branches of a traditional constituency-based phrase structure tree represent syntactic relations between the mother node and its daughter nodes. As a consequence, if the sentence S (say: John sent Mary flowers) is (just for the sake of argument!) analysed as a 'flat' structure with four branches J + s + M + f, there must be four relevant syntactic relations between the daughter nodes and their mother node. This is, indeed, the case: the relevant syntactic relations are Subject of (S) = John, Head of (S) = sent (let's not bring Infl or T into the picture here, OK?), Indirect Object of (S) = Mary, and Direct Object of (S) = flowers.
Alternatively, if we opted for a binary-branching analysis, as in current Merge-based theories, the sentence S would have just two branches, which, ignoring the actual labels now used, I will simply call Subject of (S) = John and Predicate of (S) = sent Mary flowers.
However, if we had not specified that (under that analysis!) the mother node must be the pivot of such syntactic relations, our initial flat tree would also represent irrelevant syntagmatic ‘connections’, mediated by the mother node S, between J and M, J and f, or M and f to which no known relevant syntactic function can be assigned.
Since CG rests on part-whole relations, that constraint follows from the CG approach as an inherent property, and no conceptual problem arises even if we opt for the flat analysis; and, of course, it does not arise at all if we opt for a binary-branching one (which can be turned into an argument for binary-branching analyses, by the way).
Now, suppose we say with Dependency Grammars that the phrasal ‘mother’ node is irrelevant (i.e., that part-whole relations are irrelevant to syntax) and that the only syntactic relations that count are ‘part-part’ ones between the elements of John sent Mary flowers, i.e., J, s, M and f. That hypothesis predicts the existence of relevant syntagmatic relations between 1) J and s (or s and J), 2) M and s (or s and M), and 3) f and s (or s and f), and let’s grant that, in this simple case, it is possible to label them, respectively, ‘subject of’, ‘indirect object of’, and ‘direct object of’ sent. [The opposite strategy, to say that ‘verb of’ (J) = sent & ‘verb of’ (M) = sent & ‘verb of’ (f) = sent, would leave us in the dark as to the functions of J, M and f, and, of course, would be too uninformative to be worth considering.]
Such a flat analysis, however, also predicts the existence of additional syntagmatic relations between 4) J and M (or M and J), 5) J and f (or f and J), and 6) M and f (or f and M) for which no known syntactic function label is available (a problem that would not have arisen under a binary-branching analysis of S, recall).
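The combinatorics behind relations 1)-6) can be made concrete with a small sketch (purely illustrative; the word list, the dictionary representation and the labels are my own assumptions, not any particular DG formalism): every part-part pair of the flat structure is a predicted syntagmatic relation, but only the verb-involving pairs receive a traditional function label.

```python
from itertools import combinations

# Illustrative sketch (assumed representation): enumerate every
# part-part pair predicted by a flat, mother-less analysis of
# "John sent Mary flowers".
words = ["John", "sent", "Mary", "flowers"]

# Only the pairs involving the verb have a traditional label
# (relations 1-3 in the text); the labels below are assumptions.
labels = {
    ("John", "sent"): "subject of",
    ("Mary", "sent"): "indirect object of",
    ("flowers", "sent"): "direct object of",
}

for a, b in combinations(words, 2):
    # Normalize pair order before looking up a label.
    pair = (a, b) if (a, b) in labels else (b, a)
    label = labels.get(pair, "?? (no known function label)")
    print(f"{a} -- {b}: {label}")
```

Of the six predicted pairs, exactly three come out unlabeled (relations 4-6 in the text), and the number of such unlabeled pairs grows quadratically with sentence length, whereas under a binary-branching constituency analysis they never arise in the first place.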
The way to account for the irrelevance of the syntagmatic relations 4), 5) and 6) is to stipulate that only relations in which the verb is involved count (for clause-level syntax). In other words, there must be a special designated term which somehow acts as the ‘pivot’, ‘head’ or ‘root’ of the whole structure, and that, under DG assumptions, is, in this case, the verb sent.
However, for as long as ‘the whole structure’ is not recognized as a relevant syntactic object at all, there is no way to even formulate that property of sent: what is sent the 'root' of in the proposition Head/Root of (?) = sent? What is ? in such an equation?
Of course, it must be the ‘phrase’ S, in this case, ergo phrasal nodes must be syntactically relevant categories or it would be impossible to define sent as the ‘root’ or ‘head of’ anything. Q.E.D.
If I am not terribly mistaken, this is a simple, but valid, conceptual argument against all DG theories to the extent that they remain faithful to their foundational manifestos. If they withdraw their foundational claims and admit that phrases are syntactically relevant objects and terms of bona fide syntactic dependencies, of course, this argument no longer applies, but, if they do, DGs can hardly be as perspicuous and efficient as CGs. Note that whereas in CG approaches, say X-bar or Merge-based syntax, the theory automatically defines a transparent correspondence between phrases and their heads (if we know the label of the head, we know the label of the phrasal node, and vice versa), in DG such correspondences must be stipulated (as happened in early PS rules!).
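The point about the head-phrase correspondence can be illustrated with a toy sketch (my own minimal representation, not any published formalism): in a Merge-style system the label of each phrase is read off its designated head automatically, so nothing extra need be stipulated to say what sent is the head of.

```python
# Toy Merge-style labeling (illustrative assumptions only): each
# syntactic object carries a category label, and merging two objects
# yields a phrase whose label projects from the designated head.

def merge(head, dependent):
    # The phrase's label is simply the head's label: no stipulation needed.
    return {"label": head["label"], "head": head, "dependent": dependent}

sent = {"label": "V", "word": "sent"}
john, mary, flowers = ({"label": "N", "word": w} for w in ("John", "Mary", "flowers"))

# Build "sent Mary flowers", then combine it with the subject "John".
vp = merge(merge(sent, mary), flowers)
s = merge(vp, john)

print(vp["label"], s["label"])  # both phrases project 'V' from the head 'sent'
```

In a DG diagram, by contrast, there is no phrasal node whose label could be projected in this way, so any correspondence between a head and 'the whole it heads' has to be stated separately, which is precisely the stipulation the argument above turns on.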
This seems to me a simple, but cogent, reason to challenge the alleged superiority of DGs over state-of-the-art CGs, even if we ignore the theoretical vacuousness of DGs in chapters as important as those mentioned above and compare them to CGs only as mere diagramming tools.
[Just in case you think I have an axe to grind in this matter, let me tell you that I am by no means an orthodox Chomskyan linguist; I'm just a self-taught linguist and an intellectually open scholar who has bothered to take a good look at many other people's gardens and knows very well what it has taken LFG, GPSG-HPSG, CG, OT, FG, Word Grammar, Cognitive Grammar, FDG..., you name it, to barely mimic in their own terms just the most flagrantly necessary components of what Chomskyan P&P Theory had already achieved thirty years ago. What is a pity is that the subsequent 'minimalist programme' has largely repudiated much of that work and taken refuge in a ridiculously restricted concept of Human Language as, basically, free recursion, but that is a different matter.]