11/10/2022

Security through obscurity

A term applied by hackers to most operating system vendors' favourite way of coping with security holes - namely, ignoring them, documenting neither any known holes nor the underlying security algorithms, trusting that nobody will find out about them and that people who do find out about them won't exploit them. This never works for long and occasionally sets the world up for debacles like the RTM worm of 1988 (see Great Worm), but once the brief moments of panic created by such events subside, most vendors are all too willing to turn over and go back to sleep. After all, actually fixing the bugs would siphon off the resources needed to implement the next user-interface frill on marketing's wish list - and besides, if they started fixing security bugs customers might begin to *expect* it and imagine that their warranties of merchantability gave them some sort of rights.

Historical note: There are conflicting stories about the origin of this term. It has been claimed that it was first used in a Usenet newsgroup during a campaign to get HP/Apollo to fix security problems in its Unix-clone Aegis/DomainOS (they didn't change a thing). ITS fans, on the other hand, say it was coined years earlier in opposition to the incredibly paranoid Multics people down the hall, for whom security was everything. In the ITS culture it referred to (1) the fact that by the time a tourist figured out how to make trouble he'd generally got over the urge to make it, because he felt part of the community, and (2) (self-mockingly) the poor coverage of the documentation and obscurity of many commands. One instance of *deliberate* security through obscurity is recorded: the command to allow patching the running ITS system (altmode altmode control-R) echoed as $$^D. If you actually typed alt alt ^D, that set a flag that would prevent patching the system even if you later got it right.

Distrust between organisations and third-party researchers

According to survey data gathered for the report from 800 security leaders, 64 per cent maintain a culture of security through obscurity. Not admitting weaknesses and asking for help fixing them can cause significant damage to a brand should a "secret" vulnerability be exploited, the report explained. To create greater transparency, the report recommended building a culture of openness, avoiding assigning blame when incidents happen, providing third-party researchers with a clear process for reporting vulnerabilities, and taking an open approach to stakeholders should a breach occur.

The report also revealed a lot of distrust between organisations and third-party researchers. Sixty-seven per cent said they'd rather accept software vulnerabilities than work with hackers, while 50 per cent of hackers admitted they hadn't disclosed a bug because of a previous negative experience or the lack of a channel to report it. A lack of trust makes everyone a potential cyber enemy, the report maintained.

Suppliers' cyber security best practices as important as cost

A common criticism of security is that it slows innovation by increasing the time it takes for development teams to produce software. That need not be the case, the report maintained. To avoid that and promote collaboration, HackerOne recommended encouraging third parties to report vulnerabilities, setting up regular security briefing sessions with company brass, and translating security risk into risk to the business.
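The ITS patching trick described above can be sketched as a toy model: the real enabling sequence echoed a decoy, and typing the decoy itself tripped a permanent lockout. This is an illustrative sketch only, not ITS code; the class and key names are invented for the example.

```python
class PatchConsole:
    """Toy model of the ITS decoy-echo trick (all names invented)."""

    # The real enabling sequence, which the terminal echoed as "$$^D".
    REAL_SEQUENCE = ("altmode", "altmode", "ctrl-R")
    # What a shoulder-surfer would type after watching the echo.
    DECOY_SEQUENCE = ("alt", "alt", "ctrl-D")

    def __init__(self):
        self.patching_enabled = False
        self.locked_out = False  # set once the decoy is typed

    def type_sequence(self, keys):
        """Process one attempted command sequence; return whether patching is enabled."""
        keys = tuple(keys)
        if keys == self.DECOY_SEQUENCE:
            # Typing exactly what the echo showed reveals the user only
            # watched the echo: lock out patching permanently, even if
            # the real sequence is entered later.
            self.locked_out = True
        elif keys == self.REAL_SEQUENCE and not self.locked_out:
            self.patching_enabled = True
        return self.patching_enabled
```

A fresh console accepts the real sequence, but a console that has ever seen the decoy refuses the real sequence afterwards, matching the "set a flag that would prevent patching" behaviour in the anecdote.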