Could Semantic Wiki help improve Machine Translation by allowing machines to not only know about human languages, but also know about the world that humans talk about when using that language?
Please tell us what you think about that below.
There are three wiki projects with Semantic Web aspects that I know of: DBpedia, Semantic MediaWiki and OmegaWiki. All three do things in a different way, and it makes sense to let the three cooperate. The first two have no awareness of language at the moment; the third does. The relations known to DBpedia and particularly Semantic MediaWiki are essentially monolingual. Terminological information can be derived from the categories and the wiki links, a project that "Duesentrieb" is working on at the moment. With this awareness, many of the homonym issues can be anticipated.
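To make the homonym point concrete, here is a minimal sketch of how category information attached to article titles could be used to pick the right sense of an ambiguous term. This is purely illustrative: the data and the `disambiguate` function are invented for this example and do not reflect any actual DBpedia or Semantic MediaWiki API.

```python
# Hypothetical toy data: article titles with their wiki categories.
articles = {
    "Mercury (planet)": {"categories": {"Planets of the Solar System"}},
    "Mercury (element)": {"categories": {"Chemical elements"}},
}

def disambiguate(term, context_categories):
    """Pick the article for `term` whose categories best match the context.

    `context_categories` stands in for whatever categories the surrounding
    document has been tagged with; the article with the largest overlap wins.
    """
    best, best_overlap = None, 0
    for title, info in articles.items():
        if not title.startswith(term):
            continue
        overlap = len(info["categories"] & context_categories)
        if overlap > best_overlap:
            best, best_overlap = title, overlap
    return best

# A chemistry context selects the element, not the planet.
print(disambiguate("Mercury", {"Chemical elements", "Chemistry"}))
```

The point of the sketch is only that category overlap gives a translation engine a cheap signal for choosing between homonyms before it ever consults a bilingual dictionary.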
In OmegaWiki, concepts are now linked through "classes and class attributes". As concepts are also linked to encyclopaedic content (i.e. Wikipedia), the essential information for linking OmegaWiki to DBpedia content becomes available. This in turn allows the relations data-mined from Wikipedia to be understood in a multilingual way. There will be issues in doing this, but these issues have always been there anyway; they are the essence of why machine translation is not straightforward.
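The join described above can be sketched as follows. Assuming (hypothetically) that each OmegaWiki concept records its Wikipedia article and its translations, and that DBpedia relations are keyed by article title, a monolingual relation can be rendered in any language OmegaWiki covers. All identifiers, predicates and data here are invented for illustration.

```python
# Hypothetical OmegaWiki-style concepts: a Wikipedia link plus translations.
omegawiki = {
    "dm:1234": {
        "wikipedia": "Amsterdam",
        "translations": {"en": "Amsterdam", "nl": "Amsterdam", "de": "Amsterdam"},
    },
    "dm:5678": {
        "wikipedia": "Netherlands",
        "translations": {"en": "Netherlands", "nl": "Nederland", "de": "Niederlande"},
    },
}

# A DBpedia-style relation mined from Wikipedia, keyed by article title.
dbpedia_relations = [("Amsterdam", "capitalOf", "Netherlands")]

# Invert the article-to-concept mapping so titles resolve to concepts.
by_article = {c["wikipedia"]: dm for dm, c in omegawiki.items()}

def localize(relation, lang):
    """Render a monolingual triple with subject/object labels in `lang`."""
    subj, pred, obj = relation
    s = omegawiki[by_article[subj]]["translations"][lang]
    o = omegawiki[by_article[obj]]["translations"][lang]
    return (s, pred, o)

print(localize(dbpedia_relations[0], "de"))
```

The relation itself stays language-neutral; only the labels change, which is exactly what makes the mined Wikipedia relations usable beyond English.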
When machines are to know about the world that humans talk about when using a language, it will be necessary to know not only the semantic relations that we can agree on. We will also have to be able to indicate the relations we do NOT agree on. Many of these relations will be situational. As long as the semantic wiki, and the Semantic Web for that matter, only considers global truisms, it will not really help translation. The devil is in the detail, and the relations are in your face, i.e. in the document in front of you.