We have run into some questions here, because we use impact factor as an initial screen on submissions in /r/science. We set a threshold of 1.5, which lets most science through while stopping some of the worst-quality journals. We have had to do this because we frequently don't have mods available who are familiar enough with a specific subdiscipline to determine whether a journal is legitimate or fake. Is there a better way we could apply a quick legitimacy screen to journal articles, similar to how impact factor works?
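For what it's worth, the screen as described boils down to a simple threshold rule. A minimal sketch of that logic (the lookup table and journal names here are made up; real impact factors would come from a source like Journal Citation Reports):

```python
# Sketch of the screen described above: allow a submission only if its
# journal's impact factor meets the threshold.
IMPACT_FACTOR_THRESHOLD = 1.5

# Hypothetical example values, not real impact factors.
impact_factors = {
    "Journal A": 3.2,
    "Journal B": 0.8,
}

def passes_screen(journal: str) -> bool:
    """Return True if the journal's impact factor meets the threshold.

    Unknown journals fail the screen, since no impact factor is available.
    """
    factor = impact_factors.get(journal)
    if factor is None:
        return False
    return factor >= IMPACT_FACTOR_THRESHOLD
```

The weak point, as the question notes, is exactly the unknown-journal case: a legitimate but obscure journal and a predatory one both fall through to manual review.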