Company Blog

The New Face of Employment: Freelancers

  • Posted On  2015-07-13 10:08:46 by AryanIct.com Blog



Since the 2008 market failure and the problems that followed it, the face of employment has changed drastically. Employers are developing new ways to lower the cost of doing business, not out of stinginess but out of a real need to keep their companies afloat. One of those ways is hiring freelancers or, as they are more formally called, independent contractors.

Independent contractors are people who usually specialize in one or two types of work within a particular field. There is everything from independent computer programmers to independent telemarketers out there. They come from all walks of life, but they have one thing in common: they are especially skilled at what they do. This is why they can branch out on their own and still manage to be hired by others on a per-project basis.

Benefits of Hiring Freelancers: The Employer's View

When an employer is faced with a project that may last only a few years, or a larger long-term project that needs highly specialized hands no one within the company can supply, hiring a freelancer is worth considering. A freelancer is hired on a contract basis, is paid the fee negotiated and agreed upon, and is not considered a full company employee. While freelancers may draw a much higher hourly rate than company employees, it bears remembering that when an employer hires a freelance worker, the company does not pay payroll taxes, benefits, or unemployment insurance on their behalf; all of those are the freelancer's responsibility to cover on their end.

When shopping for a freelancer for your position, there are many consulting agencies, both brick-and-mortar and entirely online. When speaking to your account representative, make sure that you are abundantly clear about your expectations. This is very much a “what you see is what you get” business. Look into the feedback on the firm or person that you hire and get references from previous employers.

What is in it for me? The Contractor's View

For the most part, the reason people become independent contractors is that they wish to have more variety in their work and prefer to set their own hours instead of punching a clock each day at 8am and again at 5pm. By putting yourself out there as an independent contractor, you are essentially telling the world that you are an expert in your field, and chances are, if you have gone so far as to become a freelancer, you are.

Perhaps the biggest advantage of working this way is setting your own hours and making the final decisions as to what projects you take on and for how long. Your work gets evaluated on a daily basis and while for some that can be nerve wracking, for the right freelancer it is a freedom they would never give up.

OK, I’m sold. I want to hire/become a freelancer. What is my next step?

Hiring A Freelancer

You have made the decision to take that leap and hire a freelancer for your upcoming project. Let's say that it is a new computer programming project in which the result is an elegant, user-friendly interface for an otherwise command-line-only program that the office uses every day. Your first step is to choose between a brick-and-mortar agency and one of the online freelance project sites. If you select an agency, it will guide you through its own process, and since each agency is different there is little for us to add here. Using the Internet-based sites is a little different, however, and often yields excellent results at a lower cost than an agency would.

There are many freelance project posting sites out there; below are just a handful of the better-known ones:

www.oDesk.com

www.freelancer.com

www.elance.com

When you sign up and make a profile (it is usually free to do so and necessary in order to browse available freelancers), you will want to browse freelancers by keyword. Use the keywords that best describe your project; in this case I would use GUI (graphical user interface) and web design. Once you have a few freelancers in mind and a decent idea of how the site works, go ahead and make a posting for your project. This is like an advertisement for the project itself, going into what will be required, how many hours are expected to be put in, the reporting procedure for progress notes and, of course, the rate of pay. From here, you can invite certain freelancers to bid on the project while leaving it open for others to discover what is being offered and place bids themselves. Set a time limit for bids to be taken (usually anywhere from 24 hours to 2 weeks). Then, when bidding closes, select those you wish to interview and decide who will take on the project. It is as easy as that!

Becoming a Freelancer

Using the same sites we have listed above, visit them and make yourself a profile. Make sure that you treat it like a resume, as this is the first tool that potential employers will see. If it is sloppy, or clearly arrogant and bragging, you will not be taken seriously. If you are in a field like writing, programming or graphic design, make sure to maintain a portfolio that prospective employers can examine to help them make their decision. A decent, well-stocked portfolio can make the difference between programming toilet seats to flush automatically and working on the next NASA programming project.

When you are set up, take a look around your areas of interest at the various advertisements for jobs. Make certain that you read the requirements all the way through before sending in a proposal! There is nothing more embarrassing than showing that you are unable to read basic instructions by blundering into a proposal without all of the facts on hand because you did not take five minutes to read. Once an employer accepts your proposal, make certain to sit down, whether by instant messenger, Skype video conference or phone call, to hammer out terms and how the work is to be completed. Always stick to a strong work ethic, and you will have few problems to worry about in your new working environment.

Is this how employment will evolve?

It may very well be. This form of employment puts the burden of responsibility on the worker, not the employer. It is one way to make certain that companies keep their costs down while still maintaining a viable workforce. There is nothing saying that, if you enjoy working with a freelancer, you cannot renew their contract many times. Just remember, however, that they hold the ultimate power in the end. They might choose not to renew for any number of reasons, or no reason at all, as is their privilege. This form of employment also brings us to a truly worldwide workforce: the freelance boards host people from every country on the planet with an Internet connection somewhere nearby.

Last but not least, this form of employment allows employers to trade large buildings and offices for much smaller, cheaper premises, since employees often work remotely from home! As time goes on, this is becoming more and more common all over the world. Next time you look for a job, ask whether they allow you to telecommute. They just might.


Clickjacking: What is it and How You Can Protect Yourself?

  • Posted On  2015-07-13 10:06:16 by AryanIct.com Blog

Lately, there has been a lot in the news about a type of computer attack called “click-jacking”, in which, through the use of web pages, hackers cheat people out of millions of dollars by setting them up with fraudulent purchases and by mining their personal information, such as credit card numbers. Unfortunately, this type of attack is extremely hard to trace, because it is built to make it seem as if the person attacked genuinely intended the action taken or the information shared. Thankfully, there have finally been some breakthroughs in finding and arresting those who participate in this awful activity. On November 9th, 2011, the FBI shut down a ring of click-jackers who collectively stole over 14 million dollars and affected well over 4 million computers.

How does it work?

Click-jacking works by hackers creating a button on a web page that does something other than what it says it will do. For example, the button could be a simple submit button; however, instead of submitting the information for that newsletter you wanted, you just ordered a 4-year subscription to Playboy magazine. It is the art of overlaying an invisible page over the page that you see, and collecting information which is then used to defraud you. Some of the tricks that have been used are:

  •     Tricking users into enabling their web camera and audio through a Flash pop-up (Adobe has since fixed this);
  •     Making users' social networking profile information public when it was previously private;
  •     Forcing someone to follow someone else on Twitter, usually an account that posts pornography and other repulsive material;
  •     Forcing link sharing on Facebook and other link-sharing networks.
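The invisible-overlay trick behind several of those examples can be pictured with a small markup sketch (the button text, URL, and sizes below are invented for illustration, not taken from any real attack):

```html
<!-- The decoy the victim sees and believes they are clicking: -->
<button style="width:200px; height:60px;">Subscribe to our newsletter</button>

<!-- The hidden page stacked on top of it: fully transparent, but it still
     receives the click, so the victim unknowingly acts on the attacker's page. -->
<iframe src="https://hidden-target.example/confirm-purchase"
        style="position:absolute; top:0; left:0; width:200px; height:60px;
               border:0; opacity:0;">
</iframe>
```

Because the transparent iframe sits above the visible button, the browser routes the click to the framed page rather than to the button the victim thinks they pressed.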

Another variant pays hackers for how many times an advertisement on their web pages is clicked or shown. They use a form of malware called “DNSChanger”, which depends on subverted servers: the user is redirected through infected networks, putting money in the hackers' pockets and opening the computer up to serious infection.

I have a Mac (Linux, UNIX or other OS). I’m not at risk, am I?

Yes, you are at risk. Because this kind of attack uses the browser as its carrier, anyone can be at risk no matter what operating system they run. Also, since the software installed on your computer after clicking an infected link or button prevents you from reaching the anti-virus sites that would remove it, most users who are not paying close attention would never know they were infected.

What can I do to protect myself?

There are a few things you can do to keep yourself safe. First of all, keep an eye on the web pages you are directed to when you click any link, and make certain they are within the domain you expect! For example, if you go to an iTunes website to buy some music, the address should read something like store.itunes.com. If you have been click-jacked, it will read something similar enough that you may not notice unless you read it carefully. So please, keep your eyes open! There are also browser add-ons that, while taking some functionality away, will keep you safe. For Firefox there is NoScript, which blocks all potentially dangerous scripts; if you want to watch a YouTube video, you will need to tell the add-on to let it through. It can be tedious, but it is worth it.
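That “check the domain” habit can even be scripted. A small sketch of the idea (the function name isTrustedHost is our own invention, not a standard API) compares a link's hostname against a domain you trust:

```javascript
// Return true only when the URL's hostname is the trusted domain itself
// or a genuine subdomain of it (e.g. store.itunes.com under itunes.com).
function isTrustedHost(urlString, trustedDomain) {
  const { hostname } = new URL(urlString);
  return hostname === trustedDomain || hostname.endsWith('.' + trustedDomain);
}

console.log(isTrustedHost('https://store.itunes.com/album', 'itunes.com'));        // true
console.log(isTrustedHost('https://itunes.com.evil.example/login', 'itunes.com')); // false
```

Note that the look-alike address in the second call starts with "itunes.com" but actually belongs to a different domain, which is exactly the trick the article warns about.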

One other, more extreme option is to use a text-only browser like Lynx. It is exactly what it sounds like: a browser that allows nothing but text through. This is a drastic step, and one that is sure to diminish your browsing experience, but if you are that worried it is worth considering. Just make sure you read the instructions carefully; many users have reported that the program is difficult to get up and running, and the developer admits to not having the time to offer technical support.

What are my options for server side protection?

You can protect your website users from click-jacking attacks by using a bit of JavaScript code called a frame killer. What it does is stop any of the targeted content from being shown within a frame, which prevents click-jackers from making their move. For those who wish to implement it, a good cross-browser code set is:
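A minimal sketch of the frame-killer logic follows, with the framed-or-not decision factored into a pure function (the name isFramed is our own; in a real page the comparison runs directly on the browser's self and top window objects):

```javascript
// A page is being framed exactly when its own window object is not the
// top-level window object.
function isFramed(selfWindow, topWindow) {
  return selfWindow !== topWindow;
}

// In a browser, the classic pattern wires this up roughly as:
//   if (isFramed(self, top)) {
//     top.location = self.location; // break out of the hostile frame
//   }

const page = {};
console.log(isFramed(page, page)); // false: the page IS the top window
console.log(isFramed(page, {}));   // true: another window frames ours
```

This is a sketch of the general technique rather than a drop-in listing; hardened frame killers also guard against attackers who disable scripts inside the frame.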



By using this, most click-jacking attempts will be thwarted, as will several other types of attacks that rely on frames being used within a website. While this is reliable in almost all circumstances, it still pays to be as cautious as possible and to urge your website users to install tools like NoScript and to use practical sense when browsing the Internet. Such words of caution will help both your readers and yourself by keeping attackers away from your site.

What do I do if I think I’ve been affected?

The FBI has an entire taskforce devoted to just this issue. The project is called “Operation Ghost Click”, and its site has materials to help you determine whether you have been infected. If the simple test, in which you put your IP address into their search box, turns up that you have been affected, you will be given further instructions on how to file a report and assistance in regaining control over your IP.

After you have made your report to the FBI, please bring your computer to a computer professional you trust to remove such malware from your system. Because this attack uses a fairly new and complicated strategy, users should not take their computer safety lightly. Have a professional help you.

It once again all comes down to being safe on the internet. Keeping an eye on your browser's address bar and not clicking on things your gut tells you are not right are habits you should follow. Also, keeping good anti-virus software up to date on your system will help you steer clear of infected sites.


The Resurgence of Apache

  • Posted On  2015-07-13 10:04:14 by AryanIct.com Blog

Evolution occurs at such a lightning-fast pace on the World Wide Web that almost nothing maintains dominance for very long.  Things like Google as a search engine and Flash as a primary video streaming service are the exception rather than the rule.  Even then, Flash just took a severe body blow, as support for it is suddenly being abandoned in the wake of HTML 5's emergence.

To have any sort of dominance on the World Wide Web for just a year or two is amazing.  That is what makes the run the Apache Web Server has had all the more breathtaking.  It first hit the top spot in web server technology in early 1996 and hasn't given it up since.

It did come close to losing it recently, though.  Microsoft finally took the gloves off and put real effort into its web server technology.  This resulted in a surge in Microsoft web hosting that, at its peak in 2008, gave it a third of all web server deployments, just a step behind Apache.

So close, yet…

That surge ran out of fuel, though.  By the time of its November 2011 web server survey, Netcraft showed that Apache’s share of web server software was back up to a dominant 65% of all web sites.  Microsoft had fallen back down to just over 15%, and even Google so far is stuck in low also-ran single digits.  Relative newcomer nginx was third with about 8%.

So why is this dominance so pervasive?  What is it about the Apache web server that gives it such an unshakable place in the web hosting world?  Is there any indication that this will change any time soon?

The flexibility of modules – especially open source modules

Through the use of modules, which are essentially plugins to the Apache web server, the web host is able to configure Apache to their specifications.  These modules allow smooth cooperation with other applications, including other web hosting software packages.  Several dozen modules have been released by the Apache Software Foundation, and several dozen more have been developed independently.

This hints at the thing that gives these modules that bit of extra power: Apache is open-source.  This leads to the usual benefits that attend all open-source packages: individual flexibility, expansiveness of user support, rapid development and bug tracking and fixing, high efficiency, and so forth.

High portability

Another benefit to Apache’s open source nature is that it has been developed for a wide variety of operating systems.  Naturally it works on just about every major UNIX and Linux variant.  But it has also been ported over to Windows, Mac OS, AmigaOS, OS/2, and a few others that you’ve probably never even heard of.  This alone should make it clear why, until they change their philosophy, Windows Server 2008 and whatever variants follow won’t even have a prayer at competing.  It is even available in 10 spoken languages.

Other advantages

  •     Cost: We could have probably put this into the list of open-source advantages above, but it deserves reiteration in its own right.  In addition to all of the obvious advantages that this entails, consider this also: this makes the barrier to entry so low that anyone so dedicated can run a web hosting service from their basement (Indeed, a lot of people do.  Well, some of them probably use other rooms as well).
  •     Specific features: There are a few nicely built-in advantages to the Apache web server that only add to the above list.  Load balancing is one, which is why most Apache web hosts can guarantee such high uptime percentages.  Virtual hosts, meaning the ability to serve something like subdomain.domain.com, is another, and a very popular one.
  •     Security: Again, this almost goes without saying.  It bears asking though: 20 years into the World Wide Web, how many major Apache security incidents can you name?  Can anyone name any at all?  With some software packages and operating systems all but assuming that break-ins will happen, this is no mean feat.
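The virtual-host feature mentioned above works by matching the Host header of each incoming request against declared sites. A minimal configuration sketch (the hostnames and paths are invented for illustration) might look like:

```apache
# Serve two sites from one Apache instance by matching the Host header.
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
</VirtualHost>

<VirtualHost *:80>
    # The subdomain.domain.com case from the text above.
    ServerName blog.example.com
    DocumentRoot /var/www/example-blog
</VirtualHost>
```

Each block binds one hostname to its own document root, so one server (and one IP address) can host many independent sites.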

What are the competitors up to?

All of this said, all empires eventually end.  One can certainly not expect either Microsoft or Google to just roll over and play dead.  So what are they doing instead?

This year Microsoft released Windows Home Server 2011, the latest of its attempts to bring web hosting server technology to home PCs (recall the basement web host above).  Unfortunately, it shot itself in the foot right from the start.  Microsoft's Home Server technology had, since 2007, included a feature called “Drive Extender”.  This enabled a few key server abilities, namely multi-disk redundancy, a single folder namespace, and the ability to extend storage to any type of hard disk in any combination.

Thus, it was to astonished incredulity from its user base that Microsoft suddenly removed this feature, which was considered one of the server's main selling points.  The outcry resulted in Microsoft promising to use RAID technology instead.  This, however, has done little to placate the masses, and third parties have worked to fill the vacuum.  In short: don't look for Microsoft's fortunes in this area to turn around any time soon.

And the others?

Google, to put it shortly, doesn't yet look to be putting much effort into this.  Its focus seems to be using web server technology to serve its own needs.  Granted, those needs are growing by leaps and bounds, but all-purpose web hosting doesn't look to be among them anytime soon.

Nginx is the more interesting case.  Most of its market share has been achieved in the last three years, and for specific reasons: nginx was built to handle on the order of 10,000 simultaneous connections, and it supports features like MP4 streaming.

A hardening monopoly

Nginx has still not achieved the broad respect that Apache has.  Then again, it is frequently deployed in front of or alongside Apache, so some would say there is little point in considering it a “competitor” at all.  Combined, these two own ¾ of the market share, and growing.

In short, as amazing as it may seem, it appears that one of the main parameters of what makes the World Wide Web function, the underlying web hosting technology, is something that is not only not going to change anytime soon but, unless some meteor of a competitor comes in out of nowhere, is going to solidify even harder.  In a world where technologies change on an almost hourly basis, this is an astounding statement.


Usenet: The Outsider's Guide to the Internet's Oldest Community

  • Posted On  2015-07-13 10:00:30 by AryanIct.com Blog



Anyone who has used the internet for more than five years has at least once heard this thing called “Usenet” spoken of in reverent tones. The problem is, once you find someone who has used it, or read a few articles about it, you come to realize that everyone has a different description of the service. Or is it a website? Or is it the internet itself? Let's take a walk and see if we cannot untangle the mess that is the myth and reality of Usenet.

Usenet is a service, not a website.

First of all, Usenet is a service that is hosted by many computers worldwide. There are several companies and organizations that archive the messages (such as Google) and other companies that provide access to Usenet and all it holds. At the very base of it all, Usenet is an internet community held together by hundreds of thousands, if not millions, of different people through e-mail-like messages shared within newsgroups. “Newsgroup” is the proper word for each subset of Usenet; they are not “lists”, nor are they “SIGs” or just “groups”.

Finding Information within Usenet

When a person takes their first steps into Usenet, it can be extremely overwhelming to find information at first. It takes understanding the hierarchy the entire system is built on. First of all, you have your prefixes; for example, comp.lang.java is the newsgroup about the computer programming language Java. There are what have commonly been called “the big eight” hierarchies. They are:

  •     comp.* — Discussion of computer-related topics
  •     humanities.* — Discussion of the humanities (e.g. literature, philosophy)
  •     news.* — Discussion of Usenet itself
  •     sci.* — Discussion of scientific subjects
  •     rec.* — Discussion of recreational activities (e.g. games and hobbies)
  •     soc.* — Socializing and discussion of social issues
  •     talk.* — Discussion of contentious issues such as religion and politics
  •     misc.* — Miscellaneous discussion; anything which does not fit in the other hierarchies

Once you know what main heading your topic comes under, then you can easily find a group covering the issue you have interest in.

Accessing Usenet

Unfortunately, it is not as simple as loading up a webpage and reading the information there. Some of it may be accessible that way through Google Groups and other archive systems; however, the best and most reliable way to access the service is through one of the many dedicated Usenet providers out there. A subscription often comes with a newsreader, a program that connects you to the service and helps you search for the particular information you are looking for. Speaking of which, what brings people to Usenet, and what do they share in these newsgroups?

Usenet and Copyright

As we advance into the digital age, we hear more and more often about how copyright laws are being broken and information is being sent to the internet to be downloaded by any who happen across it. This includes everything from music and movies to important government documents that were not well guarded (WikiLeaks, anyone?). For the longest time, the first rule of Usenet was that you did not talk about Usenet. However, it is now well known that this service is still the wild west of the internet. Anything can be had here if you know how to find it. News, information, how-to files, movies, music, programs and more can all be found on Usenet and downloaded to your own machine. Because of how decentralized the entire system is, depending on a network of personal computers and several thousand archive sites and computer banks that provide the processing power, it is also the single most private place to share files of any type.

Now, of course, I am not suggesting that you should go out and break copyright law. I am making certain that you, the reader, are aware of the types of information to be found so that you can keep yourself away from files that may cause problems for you. There is also the fact that the majority of Usenet groups are not moderated, and files that are shared may contain viruses and other things dangerous to your computer. Before you download anything, read around the file you have your eye on, particularly notes from those who downloaded it before you. If there was a problem with the download, most people are nice enough to say something about it. This is not an excuse, however, to be sloppy about your own vigilance.

Wait, what do you mean Usenet is not moderated?!

Exactly what I said: Usenet is quite literally its own beast at this point. There was at one point a group called “the Backbone Cabal”, moderators of a sort who kept things running smoothly, but never did they attempt to staunch the flow of information or to protect anyone's tender sensibilities. In the early 1990s, they stepped away and stopped being active in the community, seeing that Usenet now had its own life and way of being.

What is its use for the average user?

There are a lot of perfectly legal reasons to be on Usenet. It is an excellent place to learn about topics that you have an interest in, like computer programming languages, or even Pinball (yes, there is a rec.games.pinball newsgroup). It is also a good place to discuss current events and to stay on top of the news of the world. It even has been and will be again (I’m sure) used as a way to organize conventions and gatherings of people who share like interests. So, although there are plenty of iffy reasons to be on Usenet, do not let that frighten you away. Come to Usenet to learn and to grow, share your knowledge with others. That is the reason it was created in the first place!


Distributed Computing: What is it and What is it Used For?

  • Posted On  2015-07-13 09:58:51 by AryanIct.com Blog



A trend we hear about now and again, mainly in relation to some rather large mathematical projects, is “distributed computing”. Distributed computing means taking a large task, breaking it down into many small parts, and having an entire bank of computers work on the project instead of just one. This allows a project to be completed faster, using the combined processing power of all the computers in the bank. Simply put, distributed computing is a way to take a task that would take one computer a million years to complete and instead hand it to a million computers, making the task attainable.
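The break-it-into-parts idea can be shown with a toy, single-machine sketch (our own example, not any real project's API): counting the primes below 1000 by splitting the range into chunks that independent machines could each handle, then summing the partial results.

```javascript
// Trial-division primality test, good enough for a toy example.
function isPrime(n) {
  if (n < 2) return false;
  for (let d = 2; d * d <= n; d++) if (n % d === 0) return false;
  return true;
}

// Cut [lo, hi) into `parts` contiguous chunks.
function splitRange(lo, hi, parts) {
  const size = Math.ceil((hi - lo) / parts);
  const chunks = [];
  for (let start = lo; start < hi; start += size) {
    chunks.push([start, Math.min(start + size, hi)]);
  }
  return chunks;
}

// The independent unit of work: each chunk could be shipped to a
// different volunteer's computer, since chunks share no state.
function countPrimesInChunk([lo, hi]) {
  let count = 0;
  for (let n = lo; n < hi; n++) if (isPrime(n)) count++;
  return count;
}

const partials = splitRange(2, 1000, 4).map(countPrimesInChunk);
const total = partials.reduce((a, b) => a + b, 0);
console.log(total); // 168 primes below 1000
```

Here the chunks run sequentially on one machine, but because each chunk is self-contained, a coordinator could hand them to separate computers and merely sum the answers that come back, which is the essence of the distributed approach.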

What types of tasks are completed in this way?

One of the more well-known tasks is PrimeGrid, a project that works to find more prime numbers through distributed computing. It is one of the wider-reaching projects and is open to the public through software called BOINC (Berkeley Open Infrastructure for Network Computing). The user can find a compatible project through the Wikipedia page here:

http://en.wikipedia.org/wiki/List_of_distributed_computing_projects

The user can often download the software from the project's site and set it up on their system. From there, the program uses the computer's downtime (that is, the time when the user is not making use of the system and the system itself is not busy with related tasks, like updating) to complete the tasks necessary for the project, often contributing just processing power rather than running the whole task on the user's system.

Now, before you go and think that this is a project and a program with little real-world use, essentially the creation of a bored MIT student, you need to understand that this form of computing is actually being funded by some of the nation's largest science foundations while it works on tasks such as predicting protein folding and finding prime numbers. In fact, just this past month (October 2011), a PrimeGrid computer and user found the largest known generalized Fermat prime, 361658^262144+1. This may mean nothing to you; however, it means the world to mathematicians and to those in many sciences that go into protecting our world.

Seti@Home

SETI@home (SETI at home) was the first significant voluntary public distributed computing project and was popularized by the movie version of Carl Sagan's novel ‘Contact’. Now, thousands of people are adding their computers' downtime toward the worthy goal of the Search for Extraterrestrial Intelligence. Aside from the goal of observing and reporting on possible intelligence outside of Earth, the second goal of the project is to prove the viability of volunteer distributed computing.

This second task it has managed admirably. Many, many projects have since moved on to become distributed volunteer projects, the information gathered from them is being analyzed every day, and new conclusions about everything from science and health issues to mathematics and interstellar discoveries are being made.

Is this the future of computing?

While this sort of computing bears a passing resemblance to cloud computing, and indeed has been mistaken for it by many, they are not the same. Still, with the power of banks of computers behind a single project, or even that of several supercomputers, we have a formidable force that we need to take rather seriously. While these sorts of projects were not really implemented until the early 2000s (2004 was the beginning of the BOINC project), we do believe we will see this taken more seriously as a way of solving processing-power issues, especially at the governmental and defense levels.

Practical uses?

The development of distributed computer networking has led to a fascinating and highly controversial development: peer-to-peer sharing. Of course, as with all things, it depends on how you use it; there are legal and illegal ways, always. Breaking copyright law by sharing copyrighted material with your friends and other network-sharing folks is one example of illegal usage. However, many private research projects have been using this architecture over the past few years and have found it useful as well as expedient in their work.

As far as governmental use goes, it has become well known that most computer nodes spend 90% of their time in “downtime”, which means they are not being used as effectively as they could be. A distributed computing network takes that time and makes it useful again, assigning large tasks that might take one computer many hours or days and dealing with them overnight. It also allows governments to create peer-to-peer sharing networks that distribute information in a hierarchical pattern, keeping the security of the system intact and making sure the information needed is available to those who have access to it while keeping it away from those who have no need.

In conclusion, distributed computing and the entire field around it, including distributed programming and volunteer computing projects, will see renewed interest as processor power at the retail level rises and more users become interested in being part of something bigger than themselves. It has been found that those with an interest in the sciences are more likely to become users of the BOINC software and take an interest in a particular project. It is a time- and energy-saving option for all flavors of users, and a method that makes the best possible use of your computers' downtime, increasing efficiency. As more and more agencies make use of this very customizable option for their interests, we believe we will see an increase in scientific and health-related answers.
