I've built Bayesian non-parametric methods that performed inference over certain formulae, FOPL subsets, or even Turing-complete programs. IMHO, it's a very exciting field that will bloom in the medium term.
I know we use ML to "grease the wheels" of inference; i.e., Cyc gains an intuition about what kinds of paths of reasoning to follow when searching for conclusions. I don't know of any higher-level hybridization experiments; I think we only have one ML person on staff and mostly our commercial efforts focus on accentuating what we can do that ML can't, so we haven't had the chance to do many projects where we combine the two as equals.
"Cyc gains an intuition about what kinds of paths of reasoning to follow when searching for conclusions"
The possible paths come purely from symbolics. But that creates a massive tree of possibilities to explore, so ML is used simply to prioritize among those subtrees.
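To make the idea concrete, here's a minimal sketch of that general pattern, not Cyc's actual code: a best-first search where the symbolic engine generates every legal step and a learned scorer only decides the order of expansion. `expand()`, `learned_score()`, and `is_goal()` are hypothetical stand-ins.

```python
import heapq

def expand(state):
    # Stand-in for the symbolic inference engine: returns the successor
    # states (partial proofs) reachable by one legal rule application.
    return [state + (rule,) for rule in ("modus_ponens", "unify", "rewrite")]

def learned_score(state):
    # Stand-in for a trained model estimating how promising a partial
    # proof is; higher means "expand this subtree sooner".
    return -len(state)  # toy heuristic: prefer shorter partial proofs

def is_goal(state):
    return len(state) >= 3  # toy goal test

def best_first_search(start, budget=100):
    # The ML part never invents a reasoning step; it only prioritizes
    # among the subtrees the symbolic side proposes.
    frontier = [(-learned_score(start), start)]
    for _ in range(budget):
        if not frontier:
            return None
        _, state = heapq.heappop(frontier)
        if is_goal(state):
            return state
        for nxt in expand(state):
            heapq.heappush(frontier, (-learned_score(nxt), nxt))
    return None

print(best_first_search(()))
```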
So basically you are learning the heuristic? Do you have any public information on that? That's something I have always wanted to work on, and I really think it could be a shortcut to AGI...
> I don't know of any higher-level hybridization experiments
That contradiction and the admission that Cyc has only "one ML person on staff" signal to me, an outsider, that the belief in parity between machine learning and "symbolic" AI might be predicated more on faith than on reason.
I would say "more on theory than on empirical evidence". It's entirely reasonable; the way your eye "thinks" is entirely different from how your higher cognition "thinks", but you need both. If you want something more concrete, here's a recent experiment done by MIT in this realm:
We aren't an ML shop ourselves; we don't claim to be. Given that we have around 100 people, we focus on what we have that's special instead of trying to compete in an overcrowded market. Hybrid AI is a part we see ourselves playing in the bigger picture of machine intelligence down the road.
Wow, I should have read further ahead in the comments before dumping my first thoughts [1] as a standalone comment.
How do you interface between the distinct parts of your machinery? Do you use deeper-layer neural network representations/activations as symbol embeddings?
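To clarify what I have in mind (a purely hypothetical sketch, not anything I know about Cyc's internals): take activations from some neural encoder and use them as vector embeddings for symbolic constants, so the symbolic side can do similarity lookups. `encode()` below is a stand-in for any trained network, and the constant names are just illustrative.

```python
import numpy as np

def encode(text, dim=8):
    # Stand-in for a neural encoder's hidden-layer activations.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

# Map symbolic constants to embedding vectors (names are illustrative).
symbol_table = {sym: encode(sym) for sym in ("#$Dog", "#$Cat", "#$Teapot")}

def nearest_symbol(query_vec):
    # Cosine-similarity lookup from a neural representation back to a symbol.
    return max(symbol_table, key=lambda s: float(symbol_table[s] @ query_vec))

print(nearest_symbol(encode("#$Dog")))
```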
We're beginning to run up against what I may not be allowed to talk about :)
But I will affirm that Cyc is fundamentally symbolics-based. We don't position ourselves as anti-ML, because it's seriously good at a certain subset of things, but Cyc would still be fully functional without any ML in the picture at all.