With public trust in journalism declining in recent decades, the media increasingly out of touch with the world it claims to cover, and social-media-enabled citizen journalism displacing professional reporting, the profession is clearly suffering an existential crisis. Is journalism still a profession? And if so, how should the media's model change to facilitate democracy? Does the media reflect society, or has it turned into a collection of white elitist enterprises that advertise 'diversity' programmes as an alibi? Ultimately, is there any point in talking about journalism as we know it, or should we focus on building a new model, one relevant to the contemporary needs of society in a post-journalism era?