Posts by chrissmith

    Cybersecurity Leaders Launch Initiative for Interoperable Security Technologies to Thwart Attacks



    • Open Cybersecurity Alliance to connect the fragmented cybersecurity landscape with common, open source code and practices 
    • IBM Security and McAfee join with Advanced Cyber Security Corp, Corsa, CrowdStrike, CyberArk, Cybereason, DFLabs, EclecticIQ, Electric Power Research Institute, Fortinet, Indegy, New Context, ReversingLabs, SafeBreach, Syncurity, ThreatQuotient, and Tufin to collaborate on new initiative at OASIS

    WASHINGTON, Oct. 08, 2019 (GLOBE NEWSWIRE) -- Borderless Cyber -- Today, the OASIS international consortium announced an industry initiative to bring interoperability and data sharing across cybersecurity products. With initial open source content and code contributed by IBM Security and McAfee, and formed under the auspices of OASIS, the Open Cybersecurity Alliance (OCA) brings together organizations and individuals from around the world to develop open source security technologies which can freely exchange information, insights, analytics, and orchestrated responses.


    According to industry analyst firm Enterprise Strategy Group, organizations use 25 to 49 different security tools from up to 10 vendors on average, each of which generates siloed data. (Cybersecurity Landscape: The Evolution of Enterprise-class Vendors).


    Connecting these tools and data requires complex integrations, taking time away from hunting and responding to threats. To accelerate and optimize security for enterprise users, the OCA will develop protocols and standards that enable tools to work together and share information across vendors. The aim is to simplify the integration of security technologies across the threat lifecycle -- from threat hunting and detection to analytics, operations, and response -- so that products can work together out of the box.


    The purpose of the OCA is to develop and promote sets of open source common content, code, tooling, patterns, and practices for interoperability and sharing data among cybersecurity tools. For enterprise users, this means:

    • Improving security visibility and ability to discover new insights and findings that might otherwise have been missed;
    • Extracting more value from existing products and reducing vendor lock-in;
    • Connecting data and sharing insights across products.

    Founders of the Alliance, IBM Security and McAfee, are joined in the initiative by Advanced Cyber Security Corp, Corsa, CrowdStrike, CyberArk, Cybereason, DFLabs, EclecticIQ, Electric Power Research Institute, Fortinet, Indegy, New Context, ReversingLabs, SafeBreach, Syncurity, ThreatQuotient, and Tufin. The OCA welcomes participation from additional organizations and individual contributors.


    “Today, organizations struggle without a standard language when sharing data between products and tools,” said Carol Geyer, chief development officer of OASIS. “We have seen efforts emerge to foster data exchange, but what has been missing is the ability for each tool to transmit and receive these messages in a standardized format, which makes integrations expensive and time-consuming. The aim of the OCA is to accelerate open sharing, making it easier for enterprises to manage and operate.”


    “When security teams are constantly spending their time manually integrating tools and maintaining those integrations, it’s not helping anyone other than the attackers,” said Jason Keirstead, chief architect, IBM Security Threat Management. “The mission of the OCA is to create a unified security ecosystem, where businesses no longer have to build one-off manual integrations between every product, but instead can build one integration to work across all, based on a commonly accepted set of standards and code.”


    “Attackers maximize damage by sharing data with one another. Our best defense strategy is to share data too,” said D.J. Long, vice president business development, McAfee. “The OCA creed is ‘Integrate once, reuse everywhere,’ which builds upon McAfee’s open philosophy that led to the OpenDXL project in 2016. Organizations will be able to seamlessly exchange data between products and tools from any provider that adopts the OCA project deliverables. We’re looking at the potential for unprecedented real-time security intelligence.”


    Go here to see additional quotes from the OCA sponsoring organizations.


    Initial technology contributions to the open project are as follows, with additions expected as part of ongoing work:

    • STIX-Shifter (from IBM Security): This project aims to create a universal, out-of-the box search capability for security products of all types, by providing a way to connect security products to other security, cloud, and software data repositories via a standardized cybersecurity data model (STIX 2). STIX-Shifter is an open source library which can identify information about potential threats within a wide variety of data repositories and translate it into a format that can be digested and analyzed by any security tool that has this standard enabled.
       
    • OpenDXL Standard Ontology (from McAfee): This project is focused on the development of an open and interoperable cybersecurity messaging format for use with the OpenDXL messaging bus. The OpenDXL Standard Ontology will be offered under the Apache 2.0 license.
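    To make the STIX-Shifter idea above concrete, here is a toy translator that turns one simple STIX 2 observation pattern into a SQL-style native query. This is purely illustrative -- the real stix-shifter library supports many data sources and far richer patterns, and the table/column naming here is invented for the sketch.

```python
import re

# Matches a single-comparison STIX 2 pattern such as:
#   [ipv4-addr:value = '198.51.100.7']
PATTERN = re.compile(r"\[(?P<obj>[\w-]+):(?P<prop>[\w.]+)\s*=\s*'(?P<value>[^']+)'\]")

def translate_to_sql(stix_pattern: str, table: str = "events") -> str:
    """Translate a single-comparison STIX pattern to a SQL-like query.

    The column-naming scheme (object type + property, joined with
    underscores) is hypothetical; a real data source defines its own schema.
    """
    match = PATTERN.fullmatch(stix_pattern.strip())
    if not match:
        raise ValueError("unsupported pattern: %s" % stix_pattern)
    column = "%s_%s" % (match["obj"].replace("-", "_"),
                        match["prop"].replace(".", "_"))
    return "SELECT * FROM %s WHERE %s = '%s'" % (table, column, match["value"])

query = translate_to_sql("[ipv4-addr:value = '198.51.100.7']")
print(query)  # SELECT * FROM events WHERE ipv4_addr_value = '198.51.100.7'
```

    The point of the standardized data model is that a tool only needs to emit and consume STIX 2 once, and a translator like this (per data source) handles the rest.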

    To learn more visit www.opencybersecurityalliance.org.


    About OASIS

    One of the most respected, member-driven standards bodies in the world, OASIS offers projects – including open source projects – a path to standardization and de jure approval for reference in international policy and procurement. OASIS has a broad technical agenda encompassing cybersecurity, privacy, cryptography, cloud computing and IoT – any initiative for developing code, APIs, specifications or reference implementations can find a home at OASIS.


    Joe Eckert for Open Cybersecurity Alliance

    Eckert Communications

    jeckert@eckertcomms.com

    Hi Eduard-


    Once you give us the versions, we will try to reproduce your issue. However, based on the current information, this is most likely a product compatibility issue or defect between TIE and MAR. It might make the most sense to open a support ticket with McAfee to get it resolved quickly.


    Once you provide the version information, we will perform a few tests via the OpenDXL Python client to ensure it is not the component causing the issue.


    Thanks a lot,

    Chris

    Hi-


    That is a great question. Unfortunately, there is not a way to recover an existing service. However, if a request is sent to a service that no longer exists, that service will automatically be removed from the fabric (prior to its TTL being reached).


    So, one possible solution is that prior to registering your new service, you could ping (send a request to) any existing service instances (you can find these via a service query). If they respond, the services are still available; if they don't, they will be removed from the fabric.
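    The ping-before-register pattern can be sketched with the standard library alone. The fabric is simulated here as a dict of service instances; with the real OpenDXL Python client you would instead send a Request to each instance's topic (e.g. via sync_request) and treat a timeout or error response as "not available".

```python
# Simulated fabric registry: service id -> request handler.
fabric_services = {
    "svc-1": lambda req: "pong",   # healthy service
    "svc-2": None,                 # process died; registration is stale
}

def ping(service_id):
    """Send a request to a service instance; True if it responds."""
    handler = fabric_services.get(service_id)
    return handler is not None and handler("ping") == "pong"

def cleanup_then_register(new_service_id, new_handler):
    """Remove unresponsive instances (before TTL expiry), then register."""
    for service_id in list(fabric_services):
        if not ping(service_id):
            del fabric_services[service_id]   # fabric drops the dead service
    fabric_services[new_service_id] = new_handler

cleanup_then_register("svc-3", lambda req: "pong")
print(sorted(fabric_services))  # ['svc-1', 'svc-3']
```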


    One approach we have discussed for some time is an automatic retry in the broker itself: if a request is sent and the target service is found to no longer exist, the broker would automatically direct that request to the next available service instance. Hopefully, that is something we can add in an upcoming release.


    Thanks,

    Chris

    Here is the documentation for the system transfer command:

    Code: System Transfer Remote Command
    system.transfer names epoServer
    Transfers systems to a different McAfee ePO server
    Requires System Tree edit permission
    Parameters:
      [names (param 1) | ids] - Supply "names" with a comma-separated list of
        names/IP addresses, or a comma-separated list of "ids" identifying the
        systems to transfer.
      epoServer (param 2) - Registered server name.
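    For illustration, remote commands like this are exposed through ePO's web API at /remote/<command>. Here is a small sketch that only builds the request URL; the host, port, and target server name are placeholders for your environment, and authentication is omitted.

```python
from urllib.parse import urlencode

def build_transfer_url(epo_host, names, target_server, port=8443):
    """Build the remote-command URL transferring `names` to `target_server`."""
    params = urlencode({
        "names": ",".join(names),       # comma-separated names/IP addresses
        "epoServer": target_server,     # registered server name (param 2)
        ":output": "json",              # request JSON-formatted output
    })
    return "https://%s:%d/remote/system.transfer?%s" % (epo_host, port, params)

url = build_transfer_url("epo.example.local", ["host-01", "10.0.0.5"],
                         "ePO-Secondary")
print(url)
```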


    Can you provide a sample of the source code you are using and the responses you are receiving?


    Thanks a lot,

    Chris

    The easiest way is to determine whether the client receiving the message was the original source of the message.
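    The origin check can be sketched like this. DXL messages carry fields identifying their source; the dict shape and key name here are illustrative stand-ins, not the actual message API.

```python
MY_CLIENT_ID = "bridge-client-1"   # hypothetical id of this bridging client

def should_forward(message):
    """Forward only messages that did not originate from this client."""
    return message.get("source_client_id") != MY_CLIENT_ID

incoming = [
    {"source_client_id": "bridge-client-1", "payload": "came from us"},
    {"source_client_id": "some-other-client", "payload": "forward this"},
]
forwarded = [m["payload"] for m in incoming if should_forward(m)]
print(forwarded)  # ['forward this']
```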


    I started to put together a flow that I was going to publish that simplified building such a bridge. Unfortunately, I haven't had time to fully document and post it yet.


    Chris

    Ok, I now understand the scenario.


    I don't think that would be possible with the native bridging support (due to the single-parent, or hub, restriction). However, it would be possible if you set up normal client connections to the fabrics and acted as a bridge yourself. Let me know if you would like me to put something together to demonstrate this.
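    The "normal connections acting as a bridge" idea can be sketched with a toy pub/sub model (stdlib only). With the real OpenDXL Python client you would hold one DxlClient connection per fabric and republish events from an event callback; the loop-prevention tag here is an illustrative stand-in for checking the message's origin.

```python
class Fabric:
    """Toy stand-in for a DXL fabric: topic -> list of callbacks."""
    def __init__(self):
        self.subscribers = {}

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers.get(topic, []):
            callback(message)

def bridge(topic, fabric_a, fabric_b):
    """Forward events on `topic` between two fabrics, tagging to avoid loops."""
    def forward(destination):
        def callback(message):
            if not message.get("bridged"):
                destination.publish(topic, dict(message, bridged=True))
        return callback
    fabric_a.subscribe(topic, forward(fabric_b))
    fabric_b.subscribe(topic, forward(fabric_a))

a, b = Fabric(), Fabric()
received = []
b.subscribe("/events", received.append)
bridge("/events", a, b)
a.publish("/events", {"payload": "hello"})
print(received)  # [{'payload': 'hello', 'bridged': True}]
```

    Because the bridge is an ordinary pair of client connections, it is not subject to the single-parent restriction of native broker bridging.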


    Thanks,

    Chris

    Great to hear you got it working!


    As far as documentation, I agree it could definitely be clearer.


    Here is the page that covers provisioning the samples for the file transfer service:


    https://opendxl-community.gith…n/pydoc/sampleconfig.html


    Each of the samples also mentions provisioning as a prerequisite:


    https://opendxl-community.gith…oc/basicstoreexample.html

    https://opendxl-community.gith…/basicserviceexample.html


    How do you think this could be improved to be clearer? Any input would be greatly appreciated!


    Thanks,

    Chris

    Yes, this can be done. I will put together a similar scenario in the morning and post the configuration.


    Also, you don't necessarily have to use hubs.


    Based on your description you want something like this, correct?


    Code: Broker Fabric
       C------>B------>A
      / \     / \     / \
     C2  C3  B2  B3  A2  A3

    With hubs, it would look like the following:


    Code: Broker Fabric w/ Hubs
     [C hub]----------->[B hub]----------->[A hub]
        / \                / \                / \
    Other C Brokers   Other B Brokers   Other A brokers


    Each separate fabric has a hub. This allows the spokes (other brokers) to stay connected if one of the hub brokers fails. The "C hub" connects to the "B hub", which connects to the "A hub".


    Please let me know if this is an equivalent scenario, and I will put together the configuration and post it.


    Thanks a lot,

    Chris

    Hi Andrew-


    So, I would suggest following my steps exactly. I am wondering if the files are being stored inside the Docker container itself. My steps do not use Docker at all, in order to rule out that scenario.


    Thanks,

    Chris