The Wikimedia Foundation suggests that machine translation is the kind of infrastructure it should invest in. Given what it aims to do, making knowledge available to everyone, this makes perfect sense. A lot of translation already goes on to fill the many gaps in the many Wikipedias, and machine translation has often been an important part of it.
One of the arguments for the WMF entering the fray is that it has something to add. It does have monetary reserves, but more importantly it has several resources that may make a difference. The two biggest are Wikipedia itself and its awesome communities.
When translating a Wikipedia article, the concepts that are specific to its subject are likely to be found in that article. Similarly, when such concepts have their own article, that article will contain a similar set of concepts. Combine this with a multilingual dictionary built with Wikidata technology along OmegaWiki lines, and it becomes relatively easy to find the corresponding expressions in articles on the same subject in different languages.
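To make the idea concrete, here is a minimal sketch of how such an alignment could work. All the data is hypothetical illustration, not real Wikidata or OmegaWiki content: items carry labels per language, and each article is treated as a bag of item IDs; pairing up the labels of the items two articles share yields candidate term correspondences.

```python
# Hypothetical multilingual dictionary: item ID -> {language: label}
labels = {
    "Q1": {"en": "heart", "nl": "hart", "de": "Herz"},
    "Q2": {"en": "blood", "nl": "bloed", "de": "Blut"},
    "Q3": {"en": "artery", "nl": "slagader", "de": "Arterie"},
}

# Hypothetical: concept IDs found in two articles on the same subject
article_concepts = {
    "en": {"Q1", "Q2", "Q3"},
    "nl": {"Q1", "Q2"},
}

def term_pairs(source_lang, target_lang):
    """Pair up the labels of every concept both articles share."""
    shared = article_concepts[source_lang] & article_concepts[target_lang]
    return {labels[q][source_lang]: labels[q][target_lang] for q in shared}

# Pairs like {'heart': 'hart', 'blood': 'bloed'} fall out "for free"
print(term_pairs("en", "nl"))
```

A real system would of course extract the concept sets from article text and Wikidata sitelinks rather than hand-written dictionaries, but the shape of the data is the same.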
The point here is that the meanings of words do not exist in a vacuum.
Once such concepts have been identified and linked to Wikipedia articles and dictionary meanings, it becomes possible to help people understand a text in a different language by providing native-language support.
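A toy sketch of what that native-language support might look like, under the same assumptions as before: the lexicon below is hypothetical, and a real reading aid would need lemmatisation and disambiguation, which this deliberately skips.

```python
# Hypothetical lexicon: surface form -> {reader language: gloss}
lexicon = {
    "hart":  {"en": "heart"},
    "bloed": {"en": "blood"},
}

def gloss(text, reader_lang):
    """Annotate each known word with a gloss in the reader's language."""
    out = []
    for word in text.split():
        entry = lexicon.get(word.lower())
        if entry and reader_lang in entry:
            out.append(f"{word} [{entry[reader_lang]}]")
        else:
            out.append(word)
    return " ".join(out)

print(gloss("Het hart pompt bloed", "en"))
# -> Het hart [heart] pompt bloed [blood]
```

The reader keeps the original text but gets the unfamiliar concepts explained in their own language.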
<GRIN> I know Erik Moeller has a lot of experience in this field and I know mutual friends are quite interested to help </GRIN>
What is relevant is that we do not have to invent something new; it has been part and parcel of things we have done before. The difference is that we have gained experience, and technology has evolved as well.