Writing Sandboxed Software

I’ve written a series of articles on various Linux sandboxing capabilities that developers can make use of to write their programs in a more secure fashion. If you’re interested, have a look.

Here are links to all of the articles:

Seccomp Filters: http://www.insanitybit.com/2014/09/08/3719/

Linux Capabilities: http://www.insanitybit.com/2014/09/08/sandboxing-linux-capabilities/

Chroot Sandbox: http://www.insanitybit.com/2014/09/08/sandboxing-chroot-sandbox/

Apparmor: http://www.insanitybit.com/2014/09/08/sandboxing-apparmor/

And here’s a link to the GitHub for SyslogParse, the program I use as a demonstration:

https://github.com/insanitybit/SyslogParser

Sandboxing: Conclusion

In total I’ve covered five methods for sandboxing code. These are certainly not the only methods, but they’re mostly simple to use, and they’re what I’ve personally used.

A large part of this sandboxing was only possible because I built the code to work this way. I split everything into privileged and unprivileged groups, and I determined my attack surface. By placing the sandboxing after the privileged code and before the attack surface, I minimized the risk of exploitation. Considering security before you write any code makes a very big difference.
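To make that ordering concrete, here’s a minimal sketch of the structure (sandbox() and parse() are hypothetical stand-ins, not the actual SyslogParse source):

#include <fcntl.h>   /* open() */

/* Hypothetical stand-ins for the real work. */
static void sandbox(void) { /* chroot, setgid/setuid, drop caps, seccomp */ }
static void parse(int fd)  { (void)fd; /* handle the untrusted input */ }

int main(void)
{
    /* 1. Privileged work first: grab every resource we'll need. */
    int log_fd = open("/var/log/auth.log", O_RDONLY);
    if (log_fd < 0)
        return 1;

    /* 2. Sandbox after the privileged code, before the attack surface. */
    sandbox();

    /* 3. Attack surface last, with no privileges left to steal. */
    parse(log_fd);
    return 0;
}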

One caveat here is that SyslogParse can no longer write files. What if, after creating rules for iptables and apparmor, I want to write them to files? It seems like I’d have to undo all of my sandboxing. But I don’t – there is a simple way to do this. All I need is to have SyslogParse spawned by another privileged process, and have that process take the output from SyslogParse, validate it, and then write it to a file.

One benefit of this “broker” process architecture is that you can actually move all of the privileged code out of SyslogParse. You can launch it as another user, in a chroot environment, and pass it a file descriptor or buffer from the privileged parent.

The downside is that the parent must remain root the entire time, and flaws in the parent could lead to it being exploited – though attacks like this should be difficult, as the broker code would be very small.
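A minimal sketch of that broker pattern, assuming the sandboxed child writes its generated rules to stdout (the output path and the validation step are hypothetical):

#include <err.h>
#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    int fds[2];
    if (pipe(fds) != 0)
        err(1, "pipe failed");

    pid_t pid = fork();
    if (pid < 0)
        err(1, "fork failed");

    if (pid == 0) {                      /* child: the sandboxed worker */
        dup2(fds[1], STDOUT_FILENO);     /* generated rules go to the pipe */
        close(fds[0]);
        close(fds[1]);
        execl("/usr/bin/syslogparse", "syslogparse", (char *)NULL);
        err(1, "exec failed");
    }

    /* parent: small, privileged broker -- read, validate, write */
    close(fds[1]);
    FILE *out = fopen("/etc/syslogparse/iptables.rules", "w"); /* hypothetical path */
    if (out == NULL)
        err(1, "fopen failed");

    char buf[4096];
    ssize_t n;
    while ((n = read(fds[0], buf, sizeof(buf))) > 0) {
        /* validate(buf, n) would go here, before trusting the output */
        fwrite(buf, 1, (size_t)n, out);
    }
    fclose(out);
    waitpid(pid, NULL, 0);
    return 0;
}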

Hopefully others can read these articles and apply the techniques to their own programs. If you build a program with what I’ve written in mind, it’s very easy to write sandboxed software, especially with a broker architecture. You’ll make an attacker miserable if you can make use of all of this – their only real course of action is to attack the kernel, and thanks to seccomp you’ve made that a pain too.

Before you write your next project, think about how you can lock it down before you start writing code.

If you have anything to add to what I’ve written – suggestions, corrections, random thoughts – I’d be happy to read comments about it and update the articles.

Here are links to all of the articles:

Seccomp Filters: http://www.insanitybit.com/2014/09/08/3719/

Linux Capabilities: http://www.insanitybit.com/2014/09/08/sandboxing-linux-capabilities/

Chroot Sandbox: http://www.insanitybit.com/2014/09/08/sandboxing-chroot-sandbox/

Apparmor: http://www.insanitybit.com/2014/09/08/sandboxing-apparmor/

And here’s a link to the GitHub for SyslogParse:

https://github.com/insanitybit/SyslogParser

Sandboxing: Apparmor

This is the fifth installment in a series on various sandboxing techniques that I’ve used in my own code to restrict an application’s capabilities. You can find a shorter overview of these techniques here. This article discusses sandboxing with Apparmor.

Mandatory Access Control:

Mandatory Access Control (MAC), like Discretionary Access Control (DAC), is a way of defining permissions for a program. Users and groups are DAC. But what if you want to confine a program running with full root? As discussed, root with full capabilities is quite dangerous – and in the case of SyslogParse quite a few of those capabilities are necessary.

Apparmor is a form of Mandatory Access Control implemented through the Linux Security Module hooks in the Linux kernel. MAC policy is defined by the administrator and can confine even root applications.

Apparmor is a *bit* out of scope for this series, as it doesn’t actually involve any code, but it’s still relevant.

The Code:

While Apparmor itself doesn’t add any code to SyslogParse, here’s the profile for the program.


# Last Modified: Wed Aug 13 18:57:15 2014
# (assumed include target -- the original was stripped by WordPress; the
#  profile's use of @{multiarch} implies the standard tunables)
#include <tunables/global>

/usr/bin/syslogparse {

/usr/bin/syslogparse mr,
/var/log/* mr,

/etc/ld.so.cache mr,

/sys/devices/system/cpu/online r,

/lib/@{multiarch}/libgcc_s.so* mr,
/lib/@{multiarch}/libc-*.so mr,
/lib/@{multiarch}/libm-*.so mr,
/lib/@{multiarch}/libpthread*.so mr,

/usr/lib/@{multiarch}/libseccomp.so* mr,
/usr/lib/@{multiarch}/libcap-ng*.so* mr,

/usr/lib/@{multiarch}/libstdc*.so* mr,

}

Apparmor is incredibly straightforward. Each rule is a path followed by one or more letters, and the letters stand for access types:

r = read
m = map
w = write

SyslogParse gets the number of CPU cores from /sys/devices/system/cpu/online, so it needs “r” access.

It needs to read some libraries in order to function.

And that’s it. Sort of… apparmor on my system is, unfortunately, quite broken. The tools for setting profiles to enforce/complain mode crash on me (I have a lot of weird profiles that I experiment with), which is actually why I started building SyslogParse. So this profile is a bit incomplete. It still needs some capabilities defined for chroot, setuid/setgid, and possibly more file access.
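For reference, the missing capability rules would look something like this in AppArmor profile syntax (untested here, given the broken tooling):

capability sys_chroot,
capability setuid,
capability setgid,
capability dac_read_search,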

Conclusion:

When enabled, an Apparmor profile begins enforcing policy as soon as the process starts. That means that, even if running as root, an attacker is always confined to the files defined in the profile. Apparmor is quite powerful, and combined with the other sandboxing techniques it’s a very nice reinforcement – writing to the chroot, for example, is now denied throughout the process’s life by both DAC and MAC.

Apparmor is, unfortunately, only available on certain distributions.

Next Up: Final Conclusion

Sandboxing: Limited Users

This is the fourth installment in a series on various sandboxing techniques that I’ve used in my own code to restrict an application’s capabilities. You can find a shorter overview of these techniques here. This article discusses sandboxing a program using limited users.

Users and Groups:

Linux Discretionary Access Control works by separating and grouping applications into ‘users’ and ‘groups’. A process running as user A is, in terms of DAC, isolated from a process running as user B.

There’s also user 0, the root user, which is a privileged user account.

Only a program running as root, or with CAP_SETUID/CAP_SETGID, can manipulate its own UID/GID. In the case of SyslogParse, we have root, and we definitely want to lose it as soon as we can.

So, after getting the file handles we need, here’s the code for dropping to a limited user account (if you’ve read the previous articles this happens right after the chroot).


#include <err.h>     /* err() */
#include <unistd.h>  /* setgid(), setuid() */

if (setgid(65534) != 0)
    err(1, "setgid failed.");
if (setuid(65534) != 0)
    err(1, "setuid failed.");

Very simple. Here’s the line-by-line explanation.


if (setgid(65534) != 0)
    err(1, "setgid failed.");

setgid(65534) sets the GID to 65534. This is the “nobody” group on my system. Nobody is an unprivileged account often used by programs wanting to drop privileges. If 65534 doesn’t exist, all the better – dropping to a GID that doesn’t exist means nothing else on the system shares it.


if (setuid(65534) != 0)
    err(1, "setuid failed.");

setuid(65534) changes the user to 65534, which, as above, is the nobody user. Same as before: if the user doesn’t exist, that’s dandy.
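One addition worth making here (not in the original snippet): setgid() does not clear supplementary groups, so a fuller drop calls setgroups() first, while CAP_SETGID is still held. A sketch:

#include <err.h>
#include <grp.h>     /* setgroups() */
#include <unistd.h>  /* setgid(), setuid() */

/* Supplementary groups survive setgid(); clear them explicitly. */
if (setgroups(0, NULL) != 0)
    err(1, "setgroups failed.");
if (setgid(65534) != 0)
    err(1, "setgid failed.");
if (setuid(65534) != 0)
    err(1, "setuid failed.");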

Conclusion:

Dropping privileges is a hugely beneficial thing to do. By structuring the code as “do the privileged stuff all at once, then never again,” you can drop privileges before doing anything dangerous, and there goes an attacker’s ability to escalate.

Dropping root privileges is incredibly important. The attack surface and the amount of post-exploitation work an attacker can do both shrink drastically.

In the case of SyslogParse, any attack would be for local privilege escalation (it does no networking), so an attacker coming from a typical compromised process would probably lose privileges by exploiting it. At this stage they are in a chroot with no read or write access, running as an unprivileged user and group with no capabilities; they have access to 22 system calls, with some very nice-to-have calls such as read() denied; and their only chance of regaining a few capabilities is exploiting the few lines of code that involve opening a file.

I was going to make the next section about rlimit, but it’s really not important, and it’s also not viable unless you’ve built the application from the ground up to never write to a file – which typically involves a brokered architecture.

Next Up: Apparmor

Sandboxing: Chroot Sandbox

This is the third installment in a series on various sandboxing techniques that I’ve used in my own code to restrict an application’s capabilities. You can find a shorter overview of these techniques here. This article discusses chroot sandboxing.

Intro To Chroot:

If you’ve been on Linux for a little while you may have already heard of a chroot. Maybe some service you use “chroots” itself. You may have also heard that chroot’ing isn’t great for security, or maybe even that chroots are super easy to break out of. Whoever said that isn’t wrong: chroot environments can be great for confining some things and really awful for confining others.

Chroot is, simply, “change root”. The Linux filesystem has a root, “/” – everything is an offset from this root node. With chroot you can tell a process that “/” is actually somewhere else. From then on, as far as that process knows, the entire filesystem begins there.

There are two requirements for a process to be able to break out of a chroot environment:

1) The ability to call chroot() again (this requires root, or CAP_SYS_CHROOT)

2) The ability to write to the chroot environment.

So, as soon as you chroot, your process should drop privileges and lose the CAP_SYS_CHROOT capability. If you can remove write access, all the better.
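To see why those are the two requirements, here’s the classic break-out in sketch form (error checks elided; it assumes the process can still call chroot() and can write inside the jail):

#include <sys/stat.h>  /* mkdir() */
#include <unistd.h>    /* chroot(), chdir() */

mkdir("escape", 0755);   /* requirement 2: write access inside the jail */
chroot("escape");        /* requirement 1: CAP_SYS_CHROOT; the root moves
                            below us, but our working directory stays put */
for (int i = 0; i < 64; i++)
    chdir("..");         /* cwd is outside the new root, so ".." walks
                            up past it to the real "/" */
chroot(".");             /* the real root is ours again */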

In the case of SyslogParse I did both.

The Code:

mkdir("/tmp/syslogparse/", 400);

chdir("/tmp/syslogparse/");

if(chroot("/tmp/syslogparse/") != 0)
err(0, "chroot failed");

Line by line:


mkdir("/tmp/syslogparse/", 400);

mkdir() is a system call that makes a directory at the specified path, with the specified permissions.

In this case the code creates the directory /tmp/syslogparse/ with mode 0400: the owner (root) can read it, and no one can write to it.


chdir("/tmp/syslogparse/");

We move our working directory to the folder we’ve just created.


if(chroot("/tmp/syslogparse/") != 0)
err(0, "chroot failed");

chroot changes the root directory to the folder we’ve just created. Now the program, as far as it knows, is at “/” – the root directory. It only sees an empty file system, none of which it can write to.

The next step here is to drop privileges with setuid() and setgid(), which I’ll be going over in the next article.

Conclusion:

At this point the chroot *can* be broken out of. Nothing is stopping this program from simply changing the mode of the folder to allow writing to it. If, however, you drop privileges (again, see the next article), you’ll be in a chroot that cannot be bypassed through any design flaws in chroot itself.

The benefits of being in a no-write chroot are quite nice. The process can’t write any files, which means it can’t create named pipes or sockets for other processes – no communication.

With a Grsecurity kernel (some distros package these particular chroot modifications) there’s a host of other restrictions applied: the chroot acts sort of like a separate namespace/user, and communicating outside of the chroot in any new way is denied. The process is isolated much more strictly.

It’s a very nice way to sandbox an application, and it’s fairly simple, though not suitable for all applications.

Next Up: Limited Users

Sandboxing: Linux Capabilities

This is the second installment in a series on various sandboxing techniques that I’ve used in my own code to restrict an application’s capabilities. You can find a shorter overview of these techniques here. This article discusses Linux capabilities.

Intro To Linux Capabilities:

On Linux you’re likely familiar with the root user. Root is the ‘admin’ account of the system; it has privileges that other processes don’t. What you may not have known is that those privileges given to root are actually enumerable and defined. For example, root has the capability CAP_SYS_CHROOT, which is what allows it to call chroot().

Let’s say a program needs root, but only because it calls chroot() at some point. Instead of giving it all of root’s privileges, you can grant it just CAP_SYS_CHROOT and drop everything else.

So, if your program has to run as root (as mine does), you can actually drop some of your root privileges while maintaining others. How effective is this? Jump down to the conclusion below to see – hint: it ranges from great to awful.

The Code:

#include <linux/capability.h>  /* the capability constants */
#include <cap-ng.h>            /* the cap-ng library */

capng_clear(CAPNG_SELECT_BOTH);
capng_updatev(CAPNG_ADD, (capng_type_t)(CAPNG_EFFECTIVE | CAPNG_PERMITTED), CAP_SETUID, CAP_SETGID, CAP_SYS_CHROOT, CAP_DAC_READ_SEARCH, -1);
capng_apply(CAPNG_SELECT_BOTH);

Let’s break this down, roughly line by line.


#include <linux/capability.h>
#include <cap-ng.h>

This includes the headers for Linux capabilities (so that you can refer to them by name) and cap-ng, the library we’ll be using to actually drop privileges.


capng_clear(CAPNG_SELECT_BOTH);

This line will clear all privileges. If you were to apply this, you’d have essentially dropped all root privileges.


capng_updatev(CAPNG_ADD, (capng_type_t)(CAPNG_EFFECTIVE | CAPNG_PERMITTED), CAP_SETUID, CAP_SETGID, CAP_SYS_CHROOT, CAP_DAC_READ_SEARCH, -1);

This line is where we state which capabilities we’d like. After the capng_clear() we have none, but the program does need a few.

The first two parameters say that we’re adding these capabilities to both the effective and permitted capability sets.

The next four parameters are the capabilities to allow.

The last parameter is -1, which lets cap-ng know that the list of capabilities is terminated.


capng_apply(CAPNG_SELECT_BOTH);

And now, with this call, the rules are applied. Only these capabilities are given to the program… “only”.

Conclusion:

This was really easy to do. Three lines of code, and a large number of capabilities are gone. But what’s left?

CAP_SETUID/CAP_SETGID: Quite dangerous, as it means you can interact with processes of other UIDs/GIDs by simply making your UID/GID the same as theirs.

CAP_SYS_CHROOT: Not as scary – you can chroot, and if you retain the ability to chroot you can then break out of it.

CAP_DAC_READ_SEARCH: You can read all files that root can read. Password files, sensitive files, whatever – all yours to read.

So in a lot of ways you’re just dropping from root… to root. You’re still quite powerful and dangerous even after dropping to just these capabilities; it’s not a very large barrier, and an attacker who gains them still gains quite a lot. But, in the case of SyslogParse, all capabilities are dropped eventually.

The nice thing about capabilities is that you can drop them as soon as the program starts. After you’ve gotten your file descriptors and all that, you can go ahead and start the real sandboxing, then do the actual dangerous stuff. In this case I had to keep a lot of scary capabilities. But for someone else, maybe all they need is to bind port 80 – in that case you just keep CAP_NET_BIND_SERVICE, drop everything else, and that’s pretty nice.
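A sketch of that narrower case with the same cap-ng calls – a hypothetical server that only needs to bind a low port:

#include <linux/capability.h>
#include <cap-ng.h>

/* Keep only the ability to bind ports below 1024; drop everything else. */
capng_clear(CAPNG_SELECT_BOTH);
capng_update(CAPNG_ADD,
             (capng_type_t)(CAPNG_EFFECTIVE | CAPNG_PERMITTED),
             CAP_NET_BIND_SERVICE);
capng_apply(CAPNG_SELECT_BOTH);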

It honestly feels like a “Well, it’s better than giving it full root” in this case, which is bittersweet. It still pretty much feels like full root. But uh, hey, it’s better than giving it full root.

Sandboxing: Seccomp Filters

COG-605 dumps | COG-605 practice exam | COG-605 real exam questions | Security Through Boredom

COG-605 dumps are best to memorize and practice with VCE exam simulator to pass exam at first attempt with good marks. 100% pass guaranteed - Security Through Boredom

Pass4sure COG-605 dumps | Killexams.com COG-605 actual questions | http://www.insanitybit.com/


Killexams.com COG-605 Dumps and actual Questions

100% actual Questions - Exam Pass Guarantee with high Marks - Just Memorize the Answers



COG-605 exam Dumps Source : IBM Cognos 10 Controller Developer

Test Code : COG-605
Test designation : IBM Cognos 10 Controller Developer
Vendor designation : IBM
: 94 actual Questions

COG-605 question bank that works!
Preparing for COG-605 books can exist a intricate job and 9 out of ten chances are that youll fail if you finish it with nonexistent commandeer guidance. Thats in which satisfactory COG-605 engage is available in! It provides you with efficient and groovy data that no longer simplest enhances your practise but additionally gives you a cleanly carve threat of passing your COG-605 download and stirring into any university without any melancholy. I prepared through this awesome software and I scored 42 marks out of 50. I can assure you that its going to never assist you to down!


a passage to do together for COG-605 examination?
The exercise exam is incredible, I passed COG-605 paper with a marks of one hundred percentage. nicely worth the cost. I may exist back for my subsequent certification. initially permit me provide you with a huge thanks for giving me prep dumps for COG-605 exam. It was indeed useful for the coaching of tests and additionally clearing it. You wont believe that i got no longer a unmarried solution incorrect !!!Such comprehensive exam preparatory material are top class passage to attain high in test.


I want modern and updated dumps of COG-605 examination.
killexams.com presents dependable IT exam stuff, i absorb been the utilize of them for years. This exam is no exception: I passed COG-605 the utilize of killexams.com questions/answers and exam simulator. everything human beings teach is right: the questions are actual, this is a completely trustworthy braindump, definitely valid. And i absorb most efficient heard suitable matters about their customer service, however in my sentiment I by no means had issues that could lead me to palpate them inside the first vicinity. simply high-quality.


All actual test questions ultra-modern COG-605 examination! Are you kidding?
The rehearse exam is tremendous, I handed COG-605 paper with a score of 100 percent. Well well worth the cost. I may exist returned for my next certification. First of total permit me provide you with a expansive thanks for giving me prep dumps for COG-605 exam. It become certainly helpful for the preparation of test and too clearing it. You wont disagree with that i were given no longer a unmarried solution incorrect !!!Such comprehensive exam preparatory material are grotesque manner to score extreme in checks.


proper location to rep COG-605 actual buy a seek at question paper.
Terrific stuff for COG-605 exam which has actually helped me pass. i absorb been dreaming about the COG-605 profession for a while, but might too want to by no means accomplish time to study and actually rep licensed. As a whole lot as i was tired of books and publications, I couldnt accomplish time and simply test. The ones COG-605 made exam training definitely realistic. I even managed to test in my vehicle while the utilize of to work. The handy layout, and yes, the sorting out engine is as top because the net web page claims it is and the accurate COG-605 questions absorb helped me rep my dream certification.


i discovered a first rate source for COG-605 dumps
Though I even absorb sufficient legacy and revel in in IT, I expected the COG-605 exam to exist simpler. killexams.com has stored my time and money, with out those QAs I might absorb failed the COG-605 exam. I got burdened for few questions, so I nearly had to guess, however this is my fault. I requisite to absorb memorized rightly and pay attention the questions better. Its loyal to realize that I passed the COG-605 exam.


real COG-605 questions and redress answers! It warrant the charge.
My view of the COG-605 test fee usher changed into horrific as I normally wanted to absorb the schooling thru a test approach in a category margin and for that I joined precise schooling however those total appeared a fake ingredient for me and that i cease them perquisite away. Then I did the hunt and in the discontinuance modified my considering the COG-605 check samples and that i commenced with the equal from killexams. It surely gave me the fine scores in the exam and im blissful to absorb that.


examination questions are modified, wherein am i able to find current questions and answers?
I passed both the COG-605 first try itself with 80% and 73% resp. Thanks a lot for your help. The question bank really helped. I am thankful to killexams.com for helping a lot with so many papers with solutions to toil on if not understood. They were extremely useful. Thankyou.


Do you want latest dumps of COG-605 examination, it's far perquisite vicinity?
Being an under average pupil, I had been given frightened of the COG-605 exam as topics seemed very difficult to me. Butpassing the test become a requisite as I had to trade the undertaking badly. Searched for an effortless usher and got one with the dumps. It helped me solution total multiple nature questions in 2 hundred minutes and skip efficiently. What an exquisitequery & solutions, thoughts dumps! Satisfied to rep hold of two gives from well-known teams with good-looking bundle. I recommend most efficient killexams.com


I had no time to seek at COG-605 books and training!
Great!, I arrogant to exist trained with your COG-605 QA and software. Your software helped me a lot in preparing my IBM exams.


IBM IBM Cognos 10 Controller

IBM launches Cognos 10 | killexams.com actual Questions and Pass4sure dumps

IBM has introduced the launch of the latest version of its company intelligence application, IBM Cognos 10.

The newest replace, which IBM says is the most colossal due to the fact that it bought Cognos, goals to buy analytics to cellular gadgets and to interject a social networking approach to analytics, in order to encourage better collaboration.

Cognos 10 has a brand current materialize and consider, which IBM says mirrors individuals's daily utilize of technology, and too comprise loyal time analytics, and the capacity to deliver analytics to cellular instruments akin to iPhone and BlackBerry handsets.

The utility additionally extends the reporting of records, to existing analytics in a simpler to champion in humor format, and to accomplish analytics purchasable to the broader organisation, increasing the variety of stakeholders that may utilize commerce intelligence in the resolution making technique.

"The Cognos 10 software gives you a completely current consumer event, which enables clients to achieve collective intelligence by passage of connecting with others, sharing insights and setting up determination networks, therefore redefining the traditional strategies of interplay companies utilize with assistance and the manner clients collaborate with their peers," stated Bashar Kilani, Bashar Kilani, enterprise Unit government, IBM application neighborhood, IBM core East.


Cognos 10 and how it Represents the IBM Cloud method | killexams.com actual Questions and Pass4sure dumps

IBM’s cloud computing method is clear-cut to a degree with Cognos 10, the newest liberate of the company intelligence platform.

As a fragment of Cognos 10, IBM is delivering a cloud-based mostly provider they designation Cognos 10 within the Cloud.

It’s a hosted service for businesses that requisite to switch its license to the cloud or entirely host online.

The method gives some insight into the IBM software neighborhood’s approach to cloud computing.

We espy three distinctions to the Cognos strategy and how it displays on IBM’s method.

Hosted, no longer SaaS

The carrier has the identical elements that incorporates Cognos 10, arguably the spotlight of IBM’s suggestions on demand conference. It features predictive analytics and social collaboration capabilities via its integration with Lotus Connections.

Cognos 10 within the Cloud isn't within the vein of a traditonal SaaS service. You pay a license permeate for internet hosting your license. purchasers might too switch the license from on-premise to the IBM Cloud. consumers can too also opt to buy a hosted license with out investing in the on-premise service.

Streaming records from the Cloud

Cognos 10 is a hefty duty analytics engine. SPSS is baked in. It pulls in statistics from the cloud, including unstructured data. clients can utilize third-birthday party data to rep a archaic evaluation, espy a real-time trek or finish “what if,” evaluation.

less About Cloud, more About birth

It’s questionable if there is demand for a pure cloud computing provider. companies absorb abysmal investments in data facilities however they are seeking the passage to extend to a cloud atmosphere. That’s a fashion they espy plenty from SAP, VMware, Microsoft and a number of others.

IBM says it’s method with Cognos 10 is much less about the cloud than start. customers covet a spot to position their software license. That’s the birth IBM is making with Cognos 10 in the Cloud.

we've questions in regards to the IBM strategy. it’s just a wee a fragment of IBM’s cloud providing nonetheless it’s illustrative of a bigger approach to accomplish utilize of the cloud as an alternative start mannequin for typical software offerings.


IBM Cognos main the style for great records analytics | killexams.com actual Questions and Pass4sure dumps

Introduction

IBM has dissimilar application commerce intelligence items together with Cognos Analytics (previously Cognos commerce Intelligence), Cognos TM1, Cognos specific, and Cognos Disclosure administration. IBM Cognos commerce Intelligence front-end can connect with SAP for HANA integration from IBM Cognos 10.2.1.9 onwards through JDBC records source connectivity (IBM, n.d.). based in 1966, Jabil Circuit, Inc is likely one of the greatest electronic components manufacturing capabilities companies masking a large-range of enterprise areas equivalent to deliver chain administration, electronic circuits, design engineering, logistics, telecommunications, and semiconductor add-ons for computing. GE is among the largest clients for Jabil Circuit (Universe, n.d.).

heritage of the problem

firms want closing mile facts analytics on their economic operations. After performing procurement of substances, creation of accomplished items, inventory management, supply chain administration, creating income orders, deliveries, transportation administration, superior shipping notifications, delivery, dealing with devices administration, invoice creation, ultimately the company has to function the fiscal shut taking the refined information from the disparate records sources of the aforementioned enterprise processes. In an commerce aid Planning device, total the above company processes are unified in a lone database with a structured layout. youngsters, organizations conducting the company transactions in assorted database techniques and spreadsheets will requisite to expend a few hours to assemble the records to consolidate the customary ledger postings, money circulation, and funds stream (Srilu, 2013).

The records that comes into excel spreadsheets may additionally now not absorb validation system to examine the enter of the values apart from checking numeric, alphanumeric mixtures. There isn't any ordinary pre-defined configuration installation in excel Spreadsheet migrated from another device. this may create problems for a corporation to consolidate and migrate the records from excel spreadsheets into monetary statements for every month conclusion. excel spreadsheets can drastically behind down the enterprise multiply and the revenues of the commercial enterprise. If the commerce can't visualize the profits and regional breakdown of the key efficiency warning signs, they are not able to forecast the multiply blueprint for the subsequent quarter or subsequent month. this can influence supply chain operations, superior planning optimization, and corporation network collaboration, organisation community planning. in the conclusion, the organization will undergo losses. Jabil changed into stricken by these fundamental problems to shut the finance of the corporations on time (Thomson, 2012).

Key solutions

Jabil implemented IBM company Analytics, IBM Cognos enterprise Intelligence, IBM Cognos company standpoint, IBM Cognos Controller, and IBM Cognos TM1 to beat the financial nearby conundrums. IBM enterprise Analytics consists of a bevy of analytics solutions for organizations.

clever and Intuitive Analytics

Jabil implemented IBM Cognos Analytics, the commandeer arm of IBM commerce Analytics company Intelligence Analytics. The retort from Cognos Analytics presents the financial enterprise teams to create the analytics stories on their own. they can access the statistics points from BI Analytics through diverse sources, and with gauge drag and drop innovations they can construct the self-serviceable reports. enterprise teams can access IBM Cognos even when they are cellular or operating browser-primarily based rig on the web. Streamlined workflow gives quite a few notifications for the service provider groups to motion their specific initiatives (IBM, n.d.).

commercial enterprise Key performance warning signs

IBM commercial enterprise efficiency management offers the insights to measure the efficiency of the commercial enterprise in entirety by means of bridging the gaps discovered between the supply chain administration operations and the economic closing of the company books. It provides alternate options to build superior planning for the corporation to incorporate ideas to assignment the trends of the facts into the longer term by passage of gazing the ancient patterns of the information in every enterprise group. This gives forecasting for supply chain operations to manufacture the products into the market for numerous segments and valued clientele. corporations that may observe the premonitory watch signals notifications and traits can set the tempo for enterprise boom for each quarter to exhibit the records insights into revenues. EPM too can panoply screen the project plans developed with selected economic budgets for a bevy of a portfolio of mission options spanning a number of items to set up a knowledge-pushed company. The scorecards of EPM additionally indicates the areas the site the felony and regulatory necessities for the agencies in making ready the fiscal statements (IBM, n.d.).

Definitive Analytics

IBM Prescriptive Analytics performs prognosis on the agency ordinary efficiency with strategic management strategies. The diagnosis ends up in offering counseled alternate options to explain, buy a seek at, report, and act. commerce workflow optimization drives automation of some choices. C-stage executives can manage the business-oriented decisions. These choices can probably approach up on reviewing the brand loyalty of the consumers regarding specific items for either launching a current product or multiply an present product in the organization. The prescriptive evaluation can too tap into the localization of the events equipped by the company catering a bevy of local markets (IBM, n.d.).

Presaging Analytics

IBM commerce Intelligence offers predictive analytics for performing ultimate mile data mining analytics through wrangling the records with exploratory options to buy note the commerce perspectives from a variety of key performance indicators of commerce methods spawning from buying the uncooked substances via fiscal shut. organisations want insights to devise as a minimum 18 months forward of their superior planning optimization for supply chain administration processes. This may too exist executed through several forecasting fashions and utilized statistical methods on the statistics and glean the statistics via textual content-primarily based analytics to create trend-based fashions for the future. This helps to forecast and forecast the variety of products to launch on the commercial shelves for each retailer available in the market with who Jabil is conducting the enterprise transactions. basically, Jabil manufactures digital components to bring together a number of digital contraptions, telecommunication, and community gadget. This requires granularity of the market tendencies and the predictive analytics too can hook up to connect to R language for leveraging any extra statistical programs. IBM’s Apache Spark can function the analytics in-memory in distinction to Apache Hadoop that puts IBM ahead of the pack on file-primarily based techniques. Integration of Apache Spark with IBM predictive analytics and R can raise the rig extra (IBM, n.d.).

Governance, Compliance, and risk administration

corporations performing groups throughout the globe requisite to meet regulatory compliance requirements when buying and selling with different international locations. The dangers can approach up from the tax legal guidelines or customs tasks levied by passage of other nations throughout the globe. To steadiness the change hazards, the company transactions carried out within the commerce aid planning system or through multiple database methods should exist audited. IBM has the analytics options for managing the chance. The risk administration analytics will scan through the database of the corporation and discover the dangers involved via actual-time analytics for understanding the ever-expanding hazards within the transactions (IBM, n.d.).

notwithstanding, Jabil carried out not total the above options. youngsters, a majority of the IBM Analytics options absorb been implemented. The benefits derived through Jabil via IBM Analytics solutions offered astounding insights to obtain the operational excellence for fiscal nearby and manufacturing vegetation to race their company correctly. The solution can too exist utilized to a few different industries that are within the manufacturing of semiconductor, high-tech, telecommunications, storage instruments, and community equipment. The manufacturing technique comprises a particular method model to do together the built-in design of the circuits and chips. The grasp information management constructed via Jabil via their analytics, and commerce performance administration to panoply screen the principal thing performance indications can exist utilized to a considerable number of different industries which are under an identical class. The success of Jabil can exist a potential chance for relaxation of the industries in that angle to observe in line with the necessities evaluation and implementation of Jabil.

References

IBM (n.d.). Cognos Analytics. Retrieved November 9, 2015, from http://www.ibm.com/analytics/us/en/technology/items/cognos-analytics/

IBM (n.d.). Cognos enterprise Intelligence 10.2.1. Retrieved December 28, 2015, from http://www-969.ibm.com/utility/studies/compatibility/clarity-stories/file/html/prereqsForProduct?deliverableId=1330380859450#!

IBM (n.d.). enterprise efficiency management. Retrieved November 9, 2015, from http://www-03.ibm.com/software/items/en/class/efficiency-administration

IBM (n.d.). Predictive Analytics. Retrieved November 10, 2015, from http://www.ibm.com/analytics/us/en/technology/predictive-analytics/

IBM (n.d.). Prescriptive analytics. Retrieved November 9, 2015, from http://www-03.ibm.com/application/items/en/class/prescriptive-analytics

IBM (n.d.). risk administration. Retrieved November 10, 2015, from http://www.ibm.com/analytics/us/en/enterprise/possibility-administration.html

Srilu (2013). SAP modules overview and commerce strategies. Retrieved November 9, 2015, from http://www.slideshare.net/srilu999/sap-modules-overview-and-business-methods

Thomson, S. (2012). IBM Analytics offers Jabil superior perception Into monetary performance. Retrieved November 9, 2015, from http://www-03.ibm.com/utility/businesscasestudies/us/en/corp?synkey=M525334P42483U27

Universe, F. (n.d.). Jabil Circuit, Inc. history. Retrieved November 9, 2015, from http://www.fundinguniverse.com/enterprise-histories/jabil-circuit-inc-background/


While it is very arduous assignment to pick trustworthy certification questions / answers resources with respect to review, reputation and validity because people rep ripoff due to choosing wrong service. Killexams.com accomplish it sure to serve its clients best to its resources with respect to exam dumps update and validity. Most of other's ripoff report complaint clients approach to us for the brain dumps and pass their exams happily and easily. They never compromise on their review, reputation and quality because killexams review, killexams reputation and killexams client self-possession is principal to us. Specially they buy supervision of killexams.com review, killexams.com reputation, killexams.com ripoff report complaint, killexams.com trust, killexams.com validity, killexams.com report and killexams.com scam. If you espy any fake report posted by their competitors with the designation killexams ripoff report complaint internet, killexams.com ripoff report, killexams.com scam, killexams.com complaint or something fancy this, just champion in humor that there are always atrocious people damaging reputation of helpful services due to their benefits. There are thousands of satisfied customers that pass their exams using killexams.com brain dumps, killexams PDF questions, killexams rehearse questions, killexams exam simulator. Visit Killexams.com, their sample questions and sample brain dumps, their exam simulator and you will definitely know that killexams.com is the best brain dumps site.

Back to Braindumps Menu


M2090-733 actual questions | BH0-008 questions answers | SC0-502 test prep | ISEE cheat sheets | 920-548 free pdf | DEV-401 rehearse questions | 70-745 exam prep | 1D0-610 exam questions | C2020-615 brain dumps | P8060-017 examcollection | 1V0-602 study guide | SEC504 actual questions | 190-951 dumps | 1Z0-530 rehearse test | LOT-804 questions and answers | 250-252 test prep | 000-559 dump | 3X0-201 test questions | MSC-122 study guide | CRFA braindumps |


Looking for COG-605 exam dumps that works in actual exam?
killexams.com exam prep material gives total of you that you absorb to pass COG-605 exam. Their IBM COG-605 dumps mediate of questions that are precisely identical as actual exam. high gauge and impetus for the COG-605 Exam. They at killexams guarantees your accomplishment in COG-605 exam with their braindumps.

The only issue that's in any manner very principal perquisite here is passing the COG-605 - IBM Cognos 10 Controller Developer test. total that you requisite will exist a high score of IBM COG-605 exam. The simply a widowed facet you wish to try to is downloading braindumps of COG-605 exam confine humor directs currently. they are not letting you down as they already guaranteed the success. The professionals likewise champion step with the most up and returning test with the intent to waive the additional an area of updated dumps. One twelvemonth slack rep perquisite of entry to possess the aptitude to them via the date of purchase. every one could benifit expense of the COG-605 exam dumps through killexams.com at an occasional value. often there will exist a markdown for each cadaver all. Are you looking for IBM COG-605 Dumps of actual questions for the IBM Cognos 10 Controller Developer test prep? they proffer most updated and nice COG-605 Dumps. Detail is at http://killexams.com/pass4sure/exam-detail/COG-605. they absorb got compiled an information of COG-605 Dumps from actual tests thus on allow you to organize and pass COG-605 exam on the first attempt. simply memorize their and relax. you will pass the test. killexams.com Discount Coupons and Promo Codes are as under; WC2017 : 60% Discount Coupon for total exams on website PROF17 : 10% Discount Coupon for Orders additional than $69 DEAL17 : 15% Discount Coupon for Orders larger than $99 SEPSPECIAL : 10% Special Discount Coupon for total Orders

The most gauge approach to rep achievement in the IBM COG-605 exam is that you should accomplish solid introductory materials. They ensure that killexams.com is the greatest direct pathway closer to Implementing IBM IBM Cognos 10 Controller Developer authentication. You can exist successful with replete self conviction. You can espy free inquiries at killexams.com sooner than you buy the COG-605 exam items. Their mimicked evaluations are in two or three conclusion fancy the genuine exam design. The inquiries and answers made by the guaranteed specialists. They proffer you with the esteem of taking the genuine exam. 100% guarantee to walkaway through the COG-605 actual test.

killexams.com IBM Certification exam courses are setup by method for IT masters. Bunches of understudies absorb been grumbling that excessively numerous inquiries in such a considerable measure of activity tests and exam courses, and they're simply exhausted to discover the cash for any more prominent. Seeing killexams.com experts instructional course this entire configuration in the meantime as in any case ensure that every one the data is incorporated after profound research and assessment. Everything is to accomplish accommodation for competitors on their street to certification.

We absorb Tested and Approved COG-605 Exams. killexams.com gives the most perquisite and most recent IT exam materials which almost accommodate total data references. With the usher of their COG-605 brain dumps, you don't requisite to squander your opening on examining greater fragment of reference books and basically requisite to burn through 10-20 hours to ace their COG-605 actual issues and replies. Furthermore, they appoint you with PDF Version and Software Version exam inquiries and answers. For Software Version materials, Its introduced to give the candidates recreate the IBM COG-605 exam in a genuine domain.

We proffer free supplant. Inside legitimacy length, if COG-605 brain dumps that you absorb bought updated, they will illuminate you with the usher of email to down load best in class model of . if you don't pass your IBM IBM Cognos 10 Controller Developer exam, They will give you replete refund. You requisite to transport the verified imitation of your COG-605 exam record card to us. Subsequent to affirming, they will quick give you replete REFUND.

killexams.com Huge Discount Coupons and Promo Codes are as under;
WC2017: 60% Discount Coupon for total exams on website
PROF17: 10% Discount Coupon for Orders greater than $69
DEAL17: 15% Discount Coupon for Orders greater than $99
DECSPECIAL: 10% Special Discount Coupon for total Orders


On the off chance that you set up together for the IBM COG-605 exam the utilization of their experimenting with engine. It is effortless to prevail for total certifications in the principal endeavor. You don't must accommodate to total dumps or any free downpour/rapidshare total stuff. They proffer free demo of each IT Certification Dumps. You can try out the interface, question decent and ease of utilize of their activity appraisals before settling on a election to purchase.

COG-605 Practice Test | COG-605 examcollection | COG-605 VCE | COG-605 study guide | COG-605 practice exam | COG-605 cram


Killexams 70-486 free pdf download | Killexams 9A0-068 free pdf | Killexams 000-M241 dumps | Killexams HCE-5710 test prep | Killexams 7130X study guide | Killexams HP0-A03 braindumps | Killexams SC0-451 free pdf | Killexams 1D0-437 brain dumps | Killexams 9L0-806 questions and answers | Killexams 1D0-635 test prep | Killexams HP2-E37 test prep | Killexams ACF-CCP dumps questions | Killexams 000-220 mock exam | Killexams C2140-047 dump | Killexams 000-347 braindumps | Killexams 920-141 rehearse test | Killexams HPE2-Z38 free pdf | Killexams RDN exam questions | Killexams EX0-102 braindumps | Killexams 640-878 actual questions |


killexams.com huge List of Exam Braindumps

View Complete list of Killexams.com Brain dumps


Killexams HP0-J39 dumps | Killexams ZF-100-500 study guide | Killexams 1Z1-403 pdf download | Killexams A2090-463 free pdf download | Killexams 1Z0-147 free pdf | Killexams JN0-410 exam questions | Killexams 250-371 study guide | Killexams 650-752 actual questions | Killexams VMCE_V8 actual questions | Killexams HP0-Y28 rehearse exam | Killexams COG-300 cheat sheets | Killexams 70-486 rehearse test | Killexams ST0-118 cram | Killexams 050-644 questions and answers | Killexams 000-057 test prep | Killexams 500-285 dumps questions | Killexams HP0-794 rehearse Test | Killexams ASC-090 exam prep | Killexams 000-017 braindumps | Killexams 70-552-VB test questions |


IBM Cognos 10 Controller Developer

Pass 4 sure COG-605 dumps | Killexams.com COG-605 actual questions | http://www.insanitybit.com/

Ten current Features for Report Users in IBM Cognos 10 Report Studio | killexams.com actual questions and Pass4sure dumps

Instead of just focusing on the major features of IBM Cognos 10 Report Studio, Roger Johnson, coauthor of IBM Cognos 10 Report Studio: Practical Examples, looks at some lesser-known tools and kick properties to further enhance your Report Studio projects. From the author of 

The biggest additions to IBM Cognos Report Studio in version 10 absorb been vigorous Reports and Statistics objects. These two current ways of presenting data provide many current options for report authors to create reports that match the analytical needs of the user community. Entire classes absorb been created to focus on the creation of reports using these formats.

Other capabilities in IBM Cognos 10 Report Studio comprise the integration of external data sources and the aptitude to save Report Studio reports to exist used directly by commerce Insight Advanced authors. Using external data sources in Report Studio has been a customer request since IBM Cognos tools moved to a web-based architecture. This enhancement gives the report consumer current opportunities to process just the information needed for a specific situation. By using a common report definition between applications, users who aren't accustomed to complicated reports can receive assistance to build reports that absorb features beyond those normally available to commerce Insight Advanced users.

While the features I've discussed to this point are promoted as key features, other useful options in version 10 will multiply the effectiveness of reports to your users. Some of these features were previously available through complicated programming by the report developer, but IBM Cognos commerce Intelligence now makes this job much easier. As a report writer and instructor who can esteem the aptitude to create report designs designed to inform a story, I want to highlight 10 of these current features.

Alternative Text

The Alternative Text property has been added to graphical objects, such as images and charts, to allow screen readers to translate the text (see pattern 1). Another aspect of this feature that's applied in various properties is to provide localized text without using conditional formatting.

Colored Regions, Plot area Fill, Material Effects

The Colored Regions, Plot area Fill, and Material Effects properties allow charts to absorb more creative designs in order to enhance the overall presentation of the charts. Mixed with other chart presentation options, these features alleviate report developers to give current polish to the presentation of content. pattern 2 shows the additional gradient options that are now available, along with the current properties for the charts.

New Gradient Fill Types

Moving beyond simple linear gradients, some current fill types provide more choices for enhanced presentations. Backgrounds behind pages, objects, and selected areas can now panoply a number of gradient designs. The fill types comprise rectangular frames and embedded circular gradients, with many parameters to customize the blending of colors (see pattern 3).

Chart Combinations (Primary Axis, Secondary Axis, Primary Bottom, Secondary Bottom)

IBM Cognos 8 commerce Insight had the aptitude to set Y1 and Y2 axes. Now combination charts can exist stacked with two more axes for better analysis of related numbers. These options can exist integrated into dashboard design to enhance the presentation of related measures. pattern 4 shows a chart with the additional axes selected.

Query Calculations in Query Explorer supersede Calculated Member, Calculated Measure, Set Expression, and Intersection (tuple)

In IBM Cognos 8 Report Studio, different objects were created for each of the dimensional functions. Now they're total bundled into the Query Calculations (see pattern 5). This option simplifies the toolbox for report authors, while emphasizing that dimensional queries can absorb enhanced calculations.

Summarize wee Slices and Exploded Slices

With pie charts, two improvements comprise summarizing wee slices and exploding slices. These two options greatly help the presentation of pie charts to emphasize the most principal information (see pattern 6). In my classes, these options absorb been requested for years, and now they're delivered in this release.

Trendlines, Category Baselines, and Numeric Baselines

Report Studio functions absorb now improved on the predictive aptitude behind charts, without having to utilize external functions to cipher the numbers (see pattern 7).

Series Color Synchronization

With the changes made to enhance series, this feature allows report writers to simplify the legends when the sequence information is repeated across the combinations. By changing the sequence Color property to Match (see the short arrow in pattern 8), the legend shows the different product lines and the nested measures differently (indicated by the longer arrow), increasing the effectiveness of the presentation.

Custom Properties for Prompts

Several current properties absorb been added to enhance the prompt options. These changes allow report developers to create customized prompt options without the additional JavaScript programming required in earlier versions. Most of the prebuilt text labels with these prompts can exist customized to meet the needs of individual reports (see pattern 9).

Dimensional Set Definitions

As a welcome addition to dimensional functions, the Set Definition option (see pattern 10) allows report authors to create complicated subsets that are presented graphically. This feature can accomplish troubleshooting complicated sets much easier by providing a more modular approach to set development. This was the best current property for me, since it simplifies the evolution of sets that had to exist replete of nested functions.


Amazon takes on commerce intelligence with QuickSight | killexams.com actual questions and Pass4sure dumps

Amazon Web Services, already a expansive player in databases, is launching a current commerce intelligence tool to alleviate non-technical people accomplish sense of total that data.

QuickSight, now in preview, is a “very mercurial cloud-powered commerce intelligence service,” according to Amazon (AMZN) senior vice president Andy Jassy, speaking at AWS Re:invent in Las Vegas.

QuickSight promises to enable commerce users to peer inside the data residing in various AWS repositories, including RedShift data warehouse, Elastic MapReduce, and Amazon relational database services. There they can espy relationships between data, the company said.

Then a tool called Autograph can alleviate them build an intelligence profile about total that data, said Matt Wood, general manager of product strategy for AWS.

But some, including Forrester Research senior analyst Paul Miller, espy SPICE (for Super-fast Parallel In-memory Calculation Engine), as the actual key here.

This is fragment of Amazon’s thrust to accomplish services apposite to commerce users as well as the developers who absorb driven the public cloud’s success so far.

Oh, and QuickSight will exist 1/10th the cost of traditional BI tools, which went unnamed although Jassy famed a tool with a designation that rhymes with “Hognos,” a reference to IBM’s (IBM) Cognos BI tool but it’s not arduous to espy QuickSight competing down the road with tools fancy Tableau as well, Miller said.

This narrative will exist updated during the Re:Invent keynote.

For more on Amazon Web Services check out the video:

Subscribe to Data Sheet, Fortune’s daily newsletter on the commerce of technology.

This narrative was updated with additional analyst comment and to redress the spelling of QuickSight.


Recapping the mediate 2019 “Journey to AI” community CrowdChat: AI everywhere | killexams.com actual questions and Pass4sure dumps

Artificial intelligence is transforming every commerce process. Developers are incorporating AI – in the configuration of abysmal learning, machine learning and kindred technologies – into cloud-native applications and commerce processes through tools that enable them to compose these features as data-driven microservices.

On Thursday, IBM Corp. and SiliconANGLE’s sister market research solid Wikibon held a #Think2019 conference community CrowdChat to debate how enterprises can accomplish the journey to AI in the cloud. The hourlong online session was well-attended and there was vibrant discussion of many issues related to the journey to AI.

The CrowdChat featured the following IBM AI topic matter experts: Carlo Appugliese, Data Science and Machine Learning; Matthias Funke, Hybrid Data Management; Madhu Kochar, Cross-analytics; Hemanth Manda, IBM Cloud Private for Data; Anantha Narasimhan, UGI; and Jason Tavoularis, commerce Analytics.

Here were the most noteworthy responses from these and other participants to each of the CrowdChat questions:

Q: Rob Thomas, general manager of IBM Analytics, has said there is no AI without an IA or information architecture. How are you modernizing your data estate — the organization of your data assets — to rep ready for an AI and multicloud world?

Katie Schafer: “If anyone is looking to learn more about ICP for Data, exist sure to check-out session #2571, titled: Change the Game: Learn How to Win with AI happening on Wednesday, February 13th at 1:30pmPST in the great Theater on the Data & AI Campus.”

Hemanth Manda: “Having talked to a number of customers and business partners, this is an issue everyone is grappling with, and we are addressing it through our new platform offering ICP for Data, an integrated data and AI platform for multicloud”

Madhu Kochar: “Every client discussion starts with this dialog … and it is very critical to have a trusted analytics data foundation. It starts with: know your data, trust your data and use your data to further drive insights”

Matthias Funke: “I see this question come up everywhere. Modernization to gain agility, get insights faster, and have more people and business applications benefit from it … very often one needs to start at the bottom of the AI ladder with the question: How can I collect all the data I need, and make it accessible to the right people, at the right time? And how can I integrate data assets across different locations and data sources?”

Carlo Appugliese: “We work with clients on their Data Science Journey, and the biggest factor in winning with AI is to make sure you account for three things … the right skills, the right process/culture and finally the right tools.”

Jason Tavoularis: “Of course! AI requires data. If there’s no infrastructure, there can’t be much data, so you can’t expect the AI to be very smart.”

Anantha Narasimhan: “Our customers are looking at AI to help drive digital and potentially business transformation. At the core of AI are a) People & Culture, b) Process, c) Data … With data present all across the organization, getting a good handle on it is the very first step … Collect, organize and then analyze data, and then infuse AI models in order to operationalize … ML is a great enabler for AI. We need to remember that AI can help us win quickly.. or fall flat quickly. Because if the data is not of good quality, the models will throw up bad insights”

Tanmay Sinha: “Quality of AI models is directly proportional to the quality of data used to train the model. Without an information architecture to serve high-quality data, the AI models can be inconsistent, irrelevant or, worse, biased.”

John Furrier: “I think that he’s really nailing the core AI (and ML) angle: metadata or information that feeds AI engines is super important. If companies get this right then ML and AI soar to new heights”

Jameskobielus: “There’s no practical AI without data quality, governance, prep, and training in a high-performance data lake. Modernizing your data estate in the multicloud for AI demands an industrialized DevOps approach that automates much/most of these processes … AI can’t be smart if data scientists can’t find the right data to drive feature engineering etc. Likewise, AI models can’t do their jobs with high confidence without upfront and ongoing training from fresh operational data … Infusing AI into the business requires an operationalized data science pipeline with a strong real-time/streaming CI/CD workflow.”

David Floyer: “IMO, the future for analytics is real-time results. This means fast execution of operational AI/analytics near the data. It also means low-latency connections between applications wanting to automate processes and the AI/analytics required … For example, if you want to ensure that only employees are entering enterprise premises, there will be many enterprises with the same problem, and many solutions to purchase … There are two sources of AI solutions – internal and external, the time-honored make-or-buy decision. For products and services owned, it is vital that data is collected about those services in the IA. However, there are many technologies it would be easier to buy.”

Q: How are you increasing workload and consumption flexibility in your analytics systems?

Katie Schafer: “To learn more about how you can build a proper data architecture to improve data accessibility, don’t miss The Road to AI—A Journey to Modernize Your Data Architecture session on Wednesday, February 13th at 3:30pm PST on the Data & AI Campus.”

Carlo Appugliese: “In Data Science … the key to success is full access to all data.. In my experience, this is a hot topic and there is a balance we have to strike between security and innovation…. My experience is that full access to data for your Data Scientists and Data Engineers is critical to your business innovating…. Here is a session at #IBMThink where Experian will go into detail about their AI journey. https://myibm.ibm.com/events/think/all-sessions/session/6869A …. Here is a blog where I explained a recent AI project working with Experian. https://www.ibmbigdatahub.com/blog/how-data-science-elite-helped-uncover-gold-mine-experian”

Matthias Funke: “This is gold to me. Having a catalog of data assets at my disposal without worrying about where data resides. Avoiding or minimizing data movement to avoid lag and cost is of tremendous value.”

Madhu Kochar: “As I talk to multiple clients, access to data, especially dark data, is critical. It is also important that we have a good data virtualization story, meaning you do not always have to move your data…. The capability to join your traditional data with IoT data and real-time streaming data is critical to drive new analytics insights”

Jennifer Shin: “the notion of being able to access all data sounds like a dream I had once… then I woke up and remembered I work with people and data is messy. The reality is we can have all the data in the world, but it’s useless if it’s not accurate or of poor quality…. In a competitive market, there will always be businesses and both internal and external clients who want their data to be kept private if it provides an advantage. Being able to access the data I need when I need it is more important than having access to all of it.”

David Floyer: “It is essential to have multiple sources of data around key business processes, products and services. The quality of AI/advanced analytics will be dictated by the quality of the data sources.”

Q: Does your analytics strategy plan to move data to analytics or analytics to data? Why?

Anantha Narasimhan: “Definitely analytics coming to data – so faster decisions can be taken at the source or close to it… btw, there is an exciting session on Data Modernization strategy in a Multicloud World – by Madhu Kochar: https://myibm.ibm.com/events/think/all-sessions/session/7235A and virtualization: https://myibm.ibm.com/events/think/all-sessions/session/7223A”

Katie Schafer: “For more on business-ready data, don’t miss the Digital Transformation: A Business-Ready Data Hub for Advanced Analytics session at Think 2019 happening Friday, February 15th at 9:30am PST on the Data & AI Campus.”

Madhu Kochar: “Data gravity rules! You bring analytics to data; that is the most optimal…. Especially in the world of multicloud strategy, it is critical that we keep data where it is; thus technology like data virtualization, with governance built in so you can trust the data, drives toward trusted AI”

Matthias Funke: “Analytics to data. Any data movement or copying is expensive and leads to all kinds of issues (lineage, quality, latency, higher resource utilization and cost)”

Hemanth Manda: “Always move analytics to data .. that’s been our mantra. Data gravity should dictate your strategy. Moving against the gravity means you would end up spending a ton of resources/money & is not sustainable”

Carlo Appugliese: “In my opinion, do your analytics where the data is if you can.. There is no value in moving lots of data, but there is significant business value in doing more analytics with your data. It’s all about the speed and pace of AI projects.”

Tanmay Sinha: “Data is growing exponentially within an enterprise. Moving it becomes an avoidable expense if you can bring analytics to your data!”

Sarbjeet Johal: “When doing #ML #AI, for compute-intensive scenarios like human genome sequencing, take data to compute. For data-intensive scenarios (especially input), bring compute to data. #rethink”

Jennifer Shin: “In my experience, companies already collecting data find value in turning data into analytics, whereas companies developing new products or services find more value in using analytics for data. The best #datascience teams need to find the balance in doing both”

Jameskobielus: “In-situ/in-database analytics is a key foundation of the big data revolution. Data gravity. Now with the edge looming larger as a data source, analytics is moving closer to those nodes and getting more sophisticated there. Distributed AI.”

David Floyer: “Data in volume is costly to move & takes a lot of time. Data loses value over time. So, it is usually much cheaper to move code to data than data to code. This is especially true for operational AI/analytics, which should be moved close to the data source where possible.… It is interesting to note that when AI systems are deployed, 90%+ of the code is in operational AI, rather than ML model development.”

Q: How policy-driven are your data analytics visibility, detection and reporting activities?

Carlo Appugliese: “One of the biggest things I’ve seen is that companies think they are behind vs. other companies.. What companies need to understand is that it’s a journey and they just need to start. Most companies are learning and growing in this space….. I recommend: pick one use case, put a small team on the project and start. If it fails, that is normal. Just go to the next one; as for the wins, they will cancel out many failed projects.”

Tanmay Sinha: “To ensure unbiased AI models, policies on data analytics are more important than ever.”

Hemanth Manda: “Very little, to be honest, & I think this is a huge issue given increased and diverse regulations, GDPR being the latest… Here is a session on Data Virtualization @ Think 2019 that would be very valuable to attend: https://myibm.ibm.com/events/think/all-sessions/session/7181A”

Matthias Funke: “How important are good policies if their enforcement is not automated? Deep integration across the analytics ‘stack’ can solve for that”

Madhu Kochar: “Every CDO would want to say YES to this. We need ML/AI-based solutions to automate these activities, and we in IBM Analytics have solutions to make this easy (a hard problem)”

Jennifer Shin: “There’s always a policy, but the restrictions depend on the purpose associated with how the data is being used. When my #datascience team built models for negotiation purposes, even our internal status reports listed our work as confidential.”

John Furrier: “Policy-driven will be a very important part of a machine-driven future. Getting policy down and having machines figure out new policies on the fly addresses both on-demand AI and real-time AI”

Sarbjeet Johal: “ML and AI are the next frontiers in Data Governance Platforms, and these models will work in conjunction with policies! So it’s a “policy driven, ML enabled” approach which seems most practical with the tools we have today!”

Q: Are protection and compliance regimes built into your analytics systems, or bolted on? Why?

Matthias Funke: “I see it as a never-ending journey. One is never done. There is a legacy to start with, but at any moment, new data (sources) may get added to your current landscape. Fun!”

Tanmay Sinha: “Data privacy regulations are coming whether we like them or not. GDPR is already here, CCPA is coming soon. Enterprises, small and large, have to start thinking about the data being collected and shared.”

Jennifer Shin: “#analytics systems typically have several layers of protection and compliance regimes. Accessing the platform is at a system level, whereas anonymizing data depends on the data set (as well as any contracts associated with it)”

David Floyer: “These are early days for establishing compliance and protection policies. It will probably take a company having a Wall Street Journal disaster to focus minds on this issue!”

Q: How does your organization manage profiling, cleansing and cataloging of data?

Anantha Narasimhan: “This is perhaps the core of an organization’s journey to AI, or even to a successful Data Lake or Data Science practice…. There is an excellent session at THINK, hosted by Jay @jaylimburn – https://myibm.ibm.com/events/think/all-sessions/session/6913A …. Some organizations refer to this as Data Preparation or Data Curation…. Here’s a good session at THINK, in case you are interested: https://myibm.ibm.com/events/think/all-sessions/session/6912A”

Carlo Appugliese: “In the area of Data Science, we typically include a Data Engineer who works side by side with the Data Scientist; they are critical for taking findings and putting them into the Catalog, as well as providing the key features needed for the modeling phase…. You need a combination of a cross-functional team, the right access to data and tools to build your AI foundation…. One of the big areas we see in AI is the ability to explain what your predictive models are doing and whether you trust them.. Let me ask everyone: do you trust the decisions made by an AI/ML model?… Model bias is something we are very focused on, especially from a DevOps perspective. Understanding this is important and critical to your organization’s future as you incorporate key decisions using AI. So trust AI, but verify :)”

Sarbjeet Johal: “It’s mainly done at the LOB level in most of the companies I have worked with in an advisory capacity. Central tools, policies and procedures need to be built for data governance. I believe the WHAT of data cleansing and cataloging must stay with the LOB and the HOW with IT.”

Hemanth Manda: “As usual, there are multiple solutions to handle this, but ICP for Data is a platform that includes and enforces these capabilities by default .. Learn more @ this Think session: https://myibm.ibm.com/events/think/all-sessions/session/5478A …. Here is a third-party listing of vendors offering cleansing tools: https://www.analyticsindiamag.com/10-best-data-cleaning-tools-get-data/”

Madhu Kochar: “Besides profiling, cleansing and cataloging, data classification is another critical attribute. Here is where ML automation can go a long way. IBM Information Server provides a complete solution”

Pouya Fakhari: “An edge computing approach is made for the concept of the data warehouse, while pure cloud computing fundamentally contradicts the concept. It is generally accepted that only edge computing makes sense for systems that collect data on a massive scale – think hybrid cloud edge…. E.g. an edge computing device can outsource simple computing tasks to a cloud using a Function-as-a-Service concept. Here, the cloud does not store anything and no backend is set up on it. The cloud only offers computing power for any functions that are transmitted on the fly”

Matthias Funke: “Would disagree if you think about IoT use cases with massive volumes of data points continuously produced. Aggregation and storage can happen at the edge. It’s not just data warehousing though.”

Jennifer Shin: “I have yet to see an organization that has this process streamlined. Most established companies have many, many meetings about how a data set is going to be used internally and the logistics around it…. One of the advantages of building cutting-edge tech and creating new data products/services is that this is dealt with further down the line”

David Floyer: “This is an important requirement in the maturing of AI/advanced analytics. Solutions should support distributed and multicloud data, and ideally support orchestration and optimization of moving code to data or vice versa.”

John Furrier: “Clean data in —> great ML and AI; not clean data in —> lots of cleanup. Just say no to data pollution!!”

Q: What resources help your enterprise deploy models anywhere, securely?

Madhu Kochar: “Built-in governance for these models is critical as well.. so you really need data engineers, data scientists and data stewards to collaborate”

Carlo Appugliese: “Using Watson Machine Learning really gives you the ability to train, deploy and monitor your models.. This really gives you model portability so you can train and deploy anywhere..”

Sarbjeet Johal: “Data Governance Policies + Data Governance Skills + Stated Policies. That covers all the people, process and tech aspects.”

Carlo Appugliese: “Looking to build a new Data Science Team?… Here is a blog I put out on how to build a rock star Data Science Team! https://www.ibm.com/blogs/business-analytics/rock-star-ibm-data-science-elite-team/”

Jennifer Shin: “In my experience, IT and operations teams are very important when you need to confirm that certain governance is in place within an #analytics system, or need a new policy to be put in place… The best resource for deploying models anywhere, securely, is an IT or technology team that is knowledgeable, experienced and responsive!”

James Kobielus: “The core platform that enables enterprises to deploy models anywhere is a data-science CI/CD toolchain that can serve to any target device, node, hardware, container, and runtime environment. The “securely” part requires tight access and integrity controls throughout.”

David Floyer: “End-to-end security across development, deployment, and updating is important, and not yet at all common!”

Q: How are your analytics users using data visualization and low-code development tooling?

Katie Schafer: “Here’s a great session that will showcase the new capabilities in IBM Cognos Analytics 11.1 and how it uses AI to provide smarter self-service analytics: https://myibm.ibm.com/events/think/all-sessions/session/3651A”

Anantha Narasimhan: “Based on prior experience, when we want to accelerate self-service analytics, low code/no code becomes important… with Cognos Analytics 11.1, business users can use natural language queries to get insights into data.. and stunning visualizations to clearly spot trends or issues (sorry – shameless plug) :)”

Matthias Funke: “I see two categories of analytics users: Data Scientists using dev tooling like Jupyter notebooks and OSS visualization libraries, vs. LoB users using canned reports and dashboards.”

Hemanth Manda: “I tried using Tableau, but gave up after a few days. Nothing beats Cognos, especially after the latest improvements in 11.1”

James Kobielus: “Increasingly, analytics developers are using declarative, visual, low-code tooling to program AI/ML, with the tooling leveraging auto-ML to compile models for optimized execution on target platforms…. Analytics business users are also using self-service, visual tooling to build predictive and other advanced analytics for decision support – e.g., Cognos…. ML-driven augmented programming, leveraging low-code visual front-ends, is a huge research focus here at Wikibon. See my report from a year ago: https://wikibon.com/augmented-programming-ml-development/”

Q: What is your organization doing to manage and mitigate bias in your models?

Katie Schafer: “Here’s a session happening at Think 2019 that will dive into Detecting and Mitigating Bias in AI: https://myibm.ibm.com/events/think/all-sessions/session/3449A”

Carlo Appugliese: “What I’ve seen is companies are doing this manually, but after the fact, and they really fall short.. This topic needs to be evaluated at the beginning of your model development. We can really help companies with this using tools.”

Madhu Kochar: “Bias in AI is a very hot topic, and critical. There are great examples, I will share later, on how many societal biases are in our datasets. So we really need tools and technology to help with data traceability and explainability”

Jennifer Shin: “All models will have bias because they live in a world without perfect information, which is why being able to communicate the extent to which the bias poses a risk is so essential in #AI…. The best way to manage and mitigate bias in your model is to understand #statistics, #mathematics, #data, #science, #engineering and people…. Algorithms aren’t in and of themselves biased, but they can multiply the bias depending on how they are designed… Developing appropriate reporting and monitoring for models and algorithms implemented in production is essential for limiting bias”

Steve Ardire: “Most people think algorithms are objective, but in great part they’re opinions embedded in code. AI systems are black boxes; the data goes in and the answer comes out without an explanation for the decision. Algorithms that learn are supposed to become more accurate unless bias intrudes and amplifies stereotypes…. Current ML models understand what’s explicitly stated, but are less good at anticipating what’s not said or implied… @DameWendyDBE of the University of Southampton: the growing role of #AI in our lives is ‘too important to leave to men’ … We must develop effective mechanisms in algorithms to filter out biases and build ethics into AI, with the ability to read between the lines or handle what requires common sense.”

James Kobielus: “Debiasing models starts with debiasing data. Here’s a piece I published on the emerging best practices in this, from last year: https://www.informationweek.com/big-data/ai-machine-learning/debiasing-our-statistical-algorithms-down-to-their-roots/a/d-id/1331852”

David Floyer: “This is an important trust issue! If a company is shown not to have addressed this issue, there is serious risk of brand damage. E.g., a store with cameras using AI to help employees meet, greet or challenge customers entering the store should be especially careful!”

Sarbjeet Johal: “Always be training your models! Context injection mechanisms are poor with current tooling, but they are aware of this problem, which means they are on their way to solving it!…. You have to remove bias from the data input! Algos aren’t biased, data is! Always keep that in mind!”

Here’s the full transcript of the CrowdChat and the polls. And save this date for the Journey to Cloud CrowdChat, 9 a.m. PST Jan. 30, at https://www.crowdchat.net/think2019.

Image: Markus Spiske/Unsplash


Sandboxing: Seccomp Filters

This is the first installment in a series on various sandboxing techniques that I’ve used in my own code to restrict an application’s capabilities. You can find a shorter overview of these techniques here. This article will be discussing seccomp filters.

What is Seccomp? An Introduction:

System calls are your way of asking the kernel to do something for you. You send a message saying “Hey, open a file for me” and it’ll probably do it for you, barring permission errors or some other issue. But, if you can talk to the kernel, you can exploit the kernel. Many vulnerabilities are found in kernel system calls, leading to full root privileges – bypassing sandboxing techniques like SELinux, Apparmor, namespaces, chroots, you name it. So, how do we deal with this without patching the kernel, as a developer? Seccomp filters.

Seccomp is a way for a program to register a set of rules with the kernel. These rules deal with the system calls a program can make, and which parameters it can send with them.

When you create your rules you get a nice overview of your kernel attack surface – those calls are the ways your attacker can attack the kernel. On top of that, you’ve just reduced kernel attack surface: if an attacker requires system call A and you’ve only allowed system calls B through D, they can’t attack with system call A.

Another nice benefit is the ability to restrict capabilities. If your program never writes a file, don’t give it access to the write() system call. Now you’ve reduced the kernel attack surface, but you’ve also stopped the program from writing files.

The Code:

Seccomp code is fairly simple to use, though I haven’t found any really good documentation. Here is the seccomp code used in my program, SyslogParse, to restrict its system calls.


#include <stddef.h>
#include <err.h>
#include <sys/mman.h>  /* PROT_EXEC */
#include <seccomp.h>   /* libseccomp – link with -lseccomp */

scmp_filter_ctx ctx;
ctx = seccomp_init(SCMP_ACT_KILL); /* kill the process on any violation */

/* process/thread exit and signal handling */
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(rt_sigreturn), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(exit), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(exit_group), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(tgkill), 0);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(access), 0);

/* write() only to file descriptors 1 and 2 (stdout/stderr) */
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(write), 2,
SCMP_A0(SCMP_CMP_GE, 1),
SCMP_A0(SCMP_CMP_LE, 2)
);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(fstat), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(open), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(close), 0);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(brk), 0);

/* memory management, but disallow flags of exactly PROT_EXEC */
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(mprotect), 1,
SCMP_A2(SCMP_CMP_NE, PROT_EXEC)
);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(mmap), 2,
SCMP_A0(SCMP_CMP_EQ, NULL),
SCMP_A5(SCMP_CMP_EQ, 0)
);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(munmap), 2,
SCMP_A0(SCMP_CMP_NE, NULL),
SCMP_A1(SCMP_CMP_GE, 0)
);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(madvise), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(futex), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(execve), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(clone), 0);

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(getrlimit), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(rt_sigaction), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(rt_sigprocmask), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(set_robust_list), 0);
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(set_tid_address), 0);

if(seccomp_load(ctx) != 0) //activate filter
err(0, "seccomp_load failed");

I’ll go through this bit by bit.


scmp_filter_ctx ctx;
ctx = seccomp_init(SCMP_ACT_KILL);

This should be fairly simple to understand if you’ve written basically any code. This instantiates the seccomp filter, “ctx”, and then initializes it to kill on rule violations. Simple.


seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(futex), 0);

This line is a rule for the “futex” system call. The first parameter, “ctx”, is our instantiated filter. “SCMP_ACT_ALLOW”, the second parameter, says to allow the call when the rule’s conditions are met. The third is a macro for the futex() system call, as that’s the call we want to allow through the filter. The last parameter, “0”, is how many argument comparisons we want to attach to this rule.

Simple. So this rule will allow any futex system call regardless of parameters.
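By contrast, a rule can also validate arguments. The mprotect rule from the listing above passes one comparison, so the call is allowed only when its third argument (argument index 2, the protection flags) is not exactly PROT_EXEC:

/* allow mprotect() only when the flags are not exactly PROT_EXEC */
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(mprotect), 1,
SCMP_A2(SCMP_CMP_NE, PROT_EXEC)
);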

I chose futex in this example to demonstrate that seccomp can not protect you from every attack. Despite the heavy sandboxing I’ve done in this program, this filter will do nothing to stop attacks that use the futex system call. Recently, one vulnerability was found that could do just that – the futex requeue bug (CVE-2014-3153), where a call to futex leads to control over the kernel. Seccomp just isn’t all-powerful, but it’s a big improvement.

Note: I found all of these syscalls by repeatedly running strace on SyslogParse with different parameters. Strace will list all of the system calls as well as their arguments, which makes creating rules very easy.


if(seccomp_load(ctx) != 0) //activate filter
err(0, "seccomp_load failed");

seccomp_load(ctx) will load up the filter and from this point on it is enforced. In this case I’ve wrapped it to ensure that it either loads properly or the program won’t run.

And that’s it. That’s all the code it takes. If the program makes a call to any other system call it crashes with “Bad System Call”.

Seccomp is quite easy to use, and it’s the first thing I’d reach for if you’re considering sandboxing. All sandboxing relies on a strong kernel, but as a developer you can only change your own program, and seccomp is a good way to reduce kernel attack surface and make all other sandboxes more effective.

Linux has a few hundred system calls (on the order of 300 on a modern x86-64 kernel), and SyslogParse has dropped that down to about 22. That’s a nice drop in privileges and attack surface.

Next Up: Linux Capabilities

Writing Sandboxed Software

I wrote a program recently, SyslogParse, to display apparmor and iptables rules based on violations found in my system log. I did this because my apparmor-utils packages would always break or were quite slow when going through my profiles, and going through iptables rules in syslog was a bit of a pain too.

I decided this would be a fun project to sort of “lock down” against theoretical attacks, and I’d like to blog that experience to demonstrate how to use these different sandboxing mechanisms, as well as how they make the program more secure.

What takes place below is after the process of designing the application from a functional point of view – “what do I need this thing to do?”.

Step One: Threat Modeling

This step was a little less important for SyslogParse, as I was going to secure it regardless of real-world threats, but I’ll explain how I went about threat modeling.

The first thing I did was figure out what permissions SyslogParse needs. I know the application, by design, must read from /var/log/syslog – a file that you need root permissions to read. So I’ll be running this as root in order to do work.

To make things easier for users who don’t log to syslog, I’ll take in a path parameter, which means someone running this program can specify an arbitrary input file. That is the attack surface – one file being taken in.

An attacker who can control content in that file can potentially escalate to root privileges.

Step Two: Seccomp Mode 2 Filters

I’ve discussed seccomp filters on my blog before, but to give a short recap, seccomp filters are developer-defined rules that dictate which system calls a program can make, and do light validation of their parameters.

Seccomp filters are very simple to use, and they’re the first thing I implemented.

Here I declare the seccomp filter.

scmp_filter_ctx ctx;

Here I initialize it to kill the process when rules are violated.

ctx = seccomp_init(SCMP_ACT_KILL);

And here is an example of a seccomp rule being created.

seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(futex), 0);

In the above rule I’ve said to allow the futex system call, a call used when a program uses threads and has to set mutexes. The “0” means I have no additional verification of the system call’s arguments. In an ideal world I’d validate arguments to all of these calls, but it’s not always possible.

In the end I had about 22 calls, 3 of which I validate parameters on.
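As an example of that parameter validation, here is one of those three rules (the same one shown in the seccomp article above): it allows write() only when the file descriptor argument is 1 or 2 – stdout or stderr.

/* allow write() only to file descriptors 1 and 2 (stdout/stderr) */
seccomp_rule_add(ctx, SCMP_ACT_ALLOW, SCMP_SYS(write), 2,
SCMP_A0(SCMP_CMP_GE, 1),
SCMP_A0(SCMP_CMP_LE, 2)
);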

The thing about seccomp is that there’s no point doing sandboxing before I set this up, because without it the kernel will always be an easy target, and no matter how many sandboxes I layer on, I can’t change that from within my SyslogParse code – until I use seccomp.

Those roughly two dozen calls are quite a lot (though considerably less than it could be), and I chose futex as an example to show that despite limiting the calls, the recent futex requeue exploit would bypass this seccomp sandboxing and all other sandboxing this program uses. There’s only so much we can do from within the context of this program.

What is nice, however, is that I now know my kernel’s attack surface. Barring flaws in the seccomp enforcement, I know how my attacker can interact with the kernel, and that in itself is quite valuable.

Step Three: Chroot

By design SyslogParse must run as root in order to read root-owned files, so that means I’ve got the chroot capability (CAP_SYS_CHROOT). May as well make use of it.

There’s a misconception that chroots are really poor security boundaries. This isn’t entirely false, but it’s not the whole story.

With one call I can set up a chroot environment that’s not so easy to break out of – at least, it won’t be by the end of this article.

mkdir("/tmp/syslogparse/", 0400);

That creates a folder, /tmp/syslogparse/, with permissions (octal 0400) such that only root can read from it. Right now we’re root, so we can read from it, but that won’t last too much longer (about two more steps).

chroot("/tmp/syslogparse/");

The file system, as SyslogParse now knows it, is an empty wasteland that only root can read and no one can write to. A regular user would have no ability to read or write within it, which is nice because Inter-Process Communication (IPC) would require at least write access, and ideally read and write access.
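One subtlety worth noting (my addition, not in the original snippets): chroot() does not change the current working directory, so it’s good practice to chdir() into the new root and to check both calls – a minimal sketch:

#include <errno.h>
#include <err.h>
#include <sys/stat.h>
#include <unistd.h>

/* create the empty jail, enter it, and make sure our cwd is inside it */
if (mkdir("/tmp/syslogparse/", 0400) != 0 && errno != EEXIST)
    err(1, "mkdir failed");
if (chroot("/tmp/syslogparse/") != 0)
    err(1, "chroot failed");
if (chdir("/") != 0)
    err(1, "chdir failed");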

Step Four: RLimit

For SyslogParse this is a bit unnecessary, but I went with it anyways.

setrlimit() is a system call you can use to irreversibly limit a process in some way (once a limit is lowered, an unprivileged process can not raise it again). In this case, because I want to limit IPC, and because SyslogParse only ever writes to stdout, which is already open, I’m going to tell it that it can not write to any new files.

#include <sys/resource.h>

struct rlimit rlp;
rlp.rlim_cur = 0; /* soft limit: no bytes may be written */
rlp.rlim_max = 0; /* hard limit too – otherwise this field is uninitialized */

setrlimit(RLIMIT_FSIZE, &rlp);

In a more literal way, I’ve told the system that my process can not write to a file that is larger than 0 bytes.
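For completeness (this demonstration is my addition, not from SyslogParse): once the limit is in place, any attempt to grow a file past 0 bytes raises SIGXFSZ, which kills the process by default; if the signal is ignored, the write fails with EFBIG instead.

#include <fcntl.h>
#include <signal.h>
#include <unistd.h>

signal(SIGXFSZ, SIG_IGN);                     /* default action would kill us */
int fd = open("/tmp/demo", O_WRONLY | O_CREAT, 0600);
ssize_t n = write(fd, "x", 1);                /* fails with errno == EFBIG */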

Step Five: Dropping Privileges

The last significant step in this sandbox is to lose root. In this case, dropping to user 65534, which, at least on my system, is the ‘nobody’ user. A more ideal situation would have SyslogParse drop to a completely nonexistent user (to avoid sharing a user with another process) but I’m going with this for now.

setgid(65534);

setuid(65534);

That’s all it takes – SyslogParse is now running as the nobody user/group. No more root, and the process is within a chroot environment that it has no permissions to read or write to.
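One hedged refinement (my addition, not from the original code): both calls can fail, and a failed setuid() would leave the process running as root, so it is worth checking the return values and dropping supplementary groups first:

#include <grp.h>
#include <unistd.h>
#include <err.h>

/* drop supplementary groups, then the primary group, then the user id */
if (setgroups(0, NULL) != 0)
    err(1, "setgroups failed");
if (setgid(65534) != 0)
    err(1, "setgid failed");
if (setuid(65534) != 0)
    err(1, "setuid failed");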

Step Six: Apparmor

I’m on elementary OS, which has apparmor. So, in my makefile I’ve put an ‘mv’ command that puts my profile into the system’s apparmor directory.

For a small program like this the apparmor profile is very simple.

# Last Modified: Wed Aug 13 18:57:15 2014
#include <tunables/global>

/usr/bin/syslogparse flags=(complain) {

/usr/bin/syslogparse mr,
/var/log/* mr,

/etc/ld.so.cache mr,

/sys/devices/system/cpu/online r,
/lib/@{multiarch}/libgcc_s.so* mr,
/lib/@{multiarch}/libc-*.so mr,
/lib/@{multiarch}/libm-*.so mr,
/lib/@{multiarch}/libpthread*.so mr,

/usr/lib/@{multiarch}/libseccomp.so* mr,
/usr/lib/@{multiarch}/libstdc*.so* mr,

}

 

A few library files, read access to /var/log/ (for arbitrary log files), and, because I threaded the process, read access to /sys/devices/system/cpu/online.

The real benefit of this apparmor profile is that it takes effect before any of my code runs – the rest of the sandboxing all happens right after I open /var/log/syslog. There is very little code before that point, but there is some, and a compromise there would lead to full root control of the process. With the apparmor profile, the worst-case scenario is that the attacker has access to only what is listed there. (Note that the profile above is marked flags=(complain), which only logs violations – remove the flag, or use aa-enforce, to have apparmor actually block them.)

Conclusion

Overall, I think that’s a fairly robust sandbox. It was mostly for fun, but it was all fairly simple to implement.

If an attacker did break into this system, the above would make things quite annoying for them, though the obvious path is to simply attack one of the allowed system calls – I only validate parameters on 3 of them, and there is clearly attack surface still left.

This isn’t bullet proof, and it’s not an excuse to not test your code. I fed SyslogParse garbage files and unexpected input to make sure it failed gracefully and erred out immediately when it came across something it didn’t know how to deal with.

Lots of fun to write, and hopefully others can make use of this to make their programs a little bit stronger.

 

Windows XP Support Has Ended

A long time ago I posted an article entitled Windows XP – Abandon Ship. That was nearly one year ago today. And just a few days ago XP officially stopped getting support and patches from Microsoft.

I’d like to clear up some misconceptions that people still seem to have.

You can not be secure on Windows XP. In truth, it’s been a lost cause for quite some time, but Microsoft has been pretty good at dealing with threats through an active approach. Shatter attacks devastated XP machines due to poor privilege separation, but Microsoft addressed that issue decently with a few patches and by lowering service permissions.

Patches are not coming anymore. Support is gone. Do not expect the next big attack to be swiftly put down.

But what does that mean for you, XP user?

It could mean nothing – attackers may not care. We’ve never had such a widely used piece of software go out of support while so many people are still on it. As far as I know this is unprecedented. Predictions are meaningless – I can not tell you what attackers will do, only what they can do.

So, as always, if you’re using XP or any unsecured system you will be playing a game of chance and not skill. It becomes ‘any attacker who wants to’ as opposed to ‘any attacker who can’ when it comes to getting into your system.

Is that a system you want to rely on?

I’ll also take this time to say that no one should be extending support for XP. Notably, Google will be continuing to patch Chrome on XP. To me this is nothing but a false sense of security. Google Chrome relies heavily on its sandbox to protect its users, but any sandbox on Windows is going to rely entirely on a secure operating system. So the sandbox is very clearly not a huge barrier, because the unpatched XP kernel and services will be easily leveraged for a full sandbox escape.

No one should be encouraged to use XP now. Take no pride in it – you’re gambling, that’s it.

“But I run EMET! You said EMET is great!”

EMET is awesome. And largely useless against an attacker on XP – while it’s a cute way to push back patch time on systems by a little bit, it is by no means a significant barrier when basic memory corruption mitigations are not even supported by the operating system.

“But I run NoScript”

I love NoScript – great piece of software. But what will you do when a kernel vulnerability in text parsing is being used in the wild? You’ll get infected.

I really have very little to say here. XP is not securable. It wasn’t a year ago, and now, more than ever, it is not.

I’m not saying you’ll get infected. I’m not saying that every XP machine will be linked to a botnet in a year. I’m saying that you are not secure, and anyone who wants to take advantage of that will not have a hard time.

Penetration Testing Report

So for one of my classes I had to perform a full penetration test on a server. It wasn’t particularly difficult but I figured I’d share the report here. I’ve done this twice now for the same class (different setups) and it’s been pretty fun.

This is all purposefully vulnerable stuff. It was script-kiddie stuff to get in, but fun nonetheless. The report is written as if it had been handled by a legitimate team of pentesters.

Here’s the report.

[PDF] http://www.insanitybit.com/wp-content/uploads/2014/03/PentestReport.pdf