Saturday, August 3, 2019

4 JavaScript Design Patterns You Should Know

Introduction

Design patterns consolidate the experience of many developers into proven ways of structuring code around the problems we are trying to solve, and they give us a common vocabulary for describing solutions instead of talking only in terms of the syntax and semantics of our code.

JavaScript design patterns help developers write clean, well-organized code. Although design patterns can easily be reused, they are no substitute for developer skill; they simply help by preventing minor issues from turning into major problems in web application development, offering generalized solutions that are not tied to one specific problem.


They also shrink the overall codebase by eliminating unnecessary repetition, which makes our code more robust than ad-hoc solutions.

The design patterns include the following:
  • Module
  • Prototype
  • Observer
  • Singleton
Each pattern consists of many properties, but the following are the important key points:
  1. Context: Where/under what circumstances is the pattern used?
  2. Problem: What are we trying to solve?
  3. Solution: How does using this pattern solve our proposed problem?
  4. Implementation: What does the implementation look like?
Now that we know what design patterns are, and why they are important, let’s look at the most important design patterns used in JavaScript.

1. Module Design Pattern

A Module is a piece of independent code, so we can update the Module without affecting the other parts of the code. Modules also allow us to avoid namespace pollution by creating a separate scope for our variables. We can also reuse modules in different projects when they are decoupled from other pieces of code.

Modules are a necessary part of any cutting edge JavaScript application and help in keeping our code clean, isolated and organized. There are numerous approaches to create modules in JavaScript, one of which is the Module pattern.

For Example –
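
A minimal sketch of the Module pattern, using an IIFE to keep state private (the counter name and its methods are purely illustrative):

const counterModule = (function () {
  // private state, invisible outside the module
  let count = 0;

  // only what we return becomes public
  return {
    increment: function () {
      count += 1;
      return count;
    },
    reset: function () {
      count = 0;
    }
  };
})();

counterModule.increment();        // 1
counterModule.increment();        // 2
console.log(counterModule.count); // undefined – the variable stays private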



2. Revealing Module Pattern

The Revealing Module pattern is a slightly improved variant of the Module pattern, introduced by Christian Heilmann. The issue with the Module pattern is that we have to create new public functions just to call the private functions and variables.

In this pattern, we map the returned object’s properties to the private functions we want to expose publicly, which is why it is called the Revealing Module pattern.

For Example –
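
A minimal sketch of the Revealing Module pattern, where the private functions are defined first and the returned object simply points to the ones we want to make public (the names are illustrative):

const userModule = (function () {
  // private members
  let users = [];

  function addUser(name) {
    users.push(name);
  }

  function countUsers() {
    return users.length;
  }

  // reveal only selected private members under public names
  return {
    add: addUser,
    count: countUsers
  };
})();

userModule.add('Asha');
console.log(userModule.count()); // 1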



3. Prototype Design Pattern

As we know, JavaScript does not support classes in the classical sense (even the class syntax introduced in ES2015 is syntactic sugar over prototypes). Inheritance between objects is implemented using prototype-based programming.

This pattern lets us create objects that serve as prototypes for other objects. The prototype object is used as a blueprint for every object the constructor creates.

For Example –
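
A minimal sketch of the Prototype pattern using a constructor function whose prototype acts as the blueprint (the Car example is illustrative):

function Car(model) {
  this.model = model;
}

// methods defined on the prototype are shared by every instance
Car.prototype.drive = function () {
  return this.model + ' is driving';
};

const car1 = new Car('Sedan');
const car2 = new Car('Hatchback');

console.log(car1.drive());              // "Sedan is driving"
console.log(car2.drive());              // "Hatchback is driving"
console.log(car1.drive === car2.drive); // true – one shared function on the prototype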


4. Singleton Pattern

A Singleton is an object that can only be instantiated once. A database connection is the classic example: a new instance is created only if none exists yet (or the existing connection has been closed); otherwise the already-open instance is returned. This pattern is also referred to as the strict pattern. One drawback is that it makes testing painful, because the hidden dependencies it creates are not easily isolated for testing.


For Example –
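
A minimal sketch of the Singleton pattern, modelled on the database-connection example above (the connection object is a stand-in, not a real driver):

const database = (function () {
  let instance = null;

  function createConnection() {
    // stand-in for a real connection object
    return { id: Math.random(), status: 'open' };
  }

  return {
    getInstance: function () {
      if (!instance) {
        instance = createConnection(); // created only once
      }
      return instance;
    }
  };
})();

const a = database.getInstance();
const b = database.getInstance();
console.log(a === b); // true – both variables point to the same instance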


Conclusion

It is useful for JavaScript developers to use design patterns. Major advantages include better project maintainability and less unnecessary work in the development cycle. Although JavaScript design patterns can provide solutions to complex problems and enable faster, more efficient development, it would be wrong to conclude that they can replace developers.




Tuesday, July 9, 2019

7 Reasons to Use WordPress When You Redesign Your Small Business Website


Your site serves as the digital storefront for your company or services. If it doesn't look engaging, load quickly, or even have a mobile version, customers will keep on walking. Many small business owners depend on their site to generate a steady flow of leads and educate potential customers about what they offer.

However, when it comes to designing and developing a website, many of those business owners figure they can't have all the fancy bells and whistles that bigger organizations have. But that's not true. Why? Meet WordPress.

WordPress is a free platform to design your website easily. It’s commonly referred to as a “content management system” because of its ability to let you easily create and organize all of the pages and media you upload to your site.

As a developer, you should know why it’s one of the best options for a small business website. Here are some reasons.

1. You’ll be able to start using your website as a blog

If you're using a different site to host your blog or, worse, don't have a blog yet, switching your site over to WordPress will quickly solve that problem. Not only is the software remarkably easy to work with (adding new pages and posts literally takes seconds), it's also an ideal platform for blogging about your small business.

Simply set up a page on your website dedicated to your blog (call it "Blog" or some snappy name that plays off your brand). Then you can start adding posts to that page. A blog is the simplest and best way to continually add new content to your website and keep it up to date with your business.

2. WordPress constantly updates itself

Automatic updates mean you can be sure your site's security is always up to date and aligned with current best practices. While some other content management systems may require you to check for updates manually or may lag on maintenance, WordPress does that work for you.

3. WordPress is open source

Open source" just implies that developers can add to WordPress' software in the form of plugins, themes, and updates. How does that advantage you? The framework is continually improving and showing signs of improvement and new addition doesn't cost you a penny. You can receive every one of the rewards of these upgrades without paying for them.

4. WordPress is SEO friendly

SEO, or search engine optimization, refers to making your site more discoverable by search engines like Google and Yahoo. While mastering SEO takes time, WordPress offers business owners simple ways to optimize their site. Check out the free Yoast SEO plugin, which shows you step by step how your content ranks and where there's room to improve.

5. WordPress is no newbie

It's been around for over ten years, so it's safe to say it's a sure bet. While WordPress (like any CMS) isn't flawless, it's largely problem-free. Over the years, its developers have had time to iron out the quirks, maturing the platform into a CMS that web developers of all levels have come to love.

6. Coding for WordPress is standard for any web developer

Many small business owners hire a web developer who then builds a complicated site that nobody else can manage. That's fine if you never need to change your site again - but that's rarely the case.

One reason WordPress is so powerful is that it has become such a popular choice that most web developers know how to code for it. Whenever an issue comes up that you can't fix, or you decide to redesign your site's look, almost any developer will be able to get the job done.

7. Having a WordPress website puts you in good company

WordPress is great for small businesses because it has everything you need to make a visually pleasing, fully functional, mobile-friendly site, and it also offers plenty of room to grow if your business or budget expands down the road.

Whether you're just starting out as an entrepreneur or your small business is growing like a weed, you need a site that grows with you. WordPress gives you that option.



Thursday, July 4, 2019

The Top 10 Web Application Security Risks – OWASP


Thinking about a new web application that could be the next big thing users flock to? You're probably paying more attention to the features of this new application than to one of its most significant aspects, the one that could send people running for the hills: security, or the lack thereof.

The OWASP Top 10 is a regularly updated report outlining security concerns for web applications, focusing on the 10 most critical risks. The report is put together by a group of security specialists from all over the world. OWASP refers to the Top 10 as an 'awareness document' and recommends that all organizations incorporate the report into their processes in order to minimize and/or mitigate security risks.

Below are the security risks reported in the OWASP Top 10 report:

1. Injection – One of the easiest exploits of any website which has become even easier as more automated tools are developed to make the process faster. It occurs when untrusted data is sent to an interpreter. Prevention of such an attack can be made by setting boundaries on the input data to only accept data within those boundaries, a so-called “whitelist” of data. In addition to the whitelist, try to use a parameterized interface and also escape the input data.
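
As a minimal Node.js sketch of a parameterized query, assuming the mysql2 driver and a hypothetical users table (any parameterized interface works the same way):

const mysql = require('mysql2');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'app',
  database: 'shop'
});

// The "?" placeholder keeps the untrusted value out of the SQL text,
// so the driver passes it as data rather than executable SQL.
function findUserByEmail(email, callback) {
  connection.execute(
    'SELECT id, name FROM users WHERE email = ?',
    [email],
    callback
  );
}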

2. Cross-Site Scripting (XSS) – The most prevalent web application security flaw, according to OWASP. It occurs when user-supplied data is not escaped and/or validated. While most prevalent, it is also one of the easiest to prevent. Simply escape data and only accept data that is of the correct type, length, name and/or value, a so-called whitelist.
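
A minimal sketch of escaping user-supplied data before it is placed into HTML (a hand-rolled helper for illustration; template engines and sanitization libraries typically do this for you):

// Replace the characters that have special meaning in HTML so user input
// is rendered as text instead of being interpreted as markup or script.
function escapeHtml(unsafe) {
  return String(unsafe)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

const comment = '<script>alert(1)</script>';
const html = '<p>' + escapeHtml(comment) + '</p>';
// html === '<p>&lt;script&gt;alert(1)&lt;/script&gt;</p>'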

3. Broken Authentication and Session Management – Custom authentication schemes commonly suffer from vulnerabilities like these because building a custom authentication scheme is hard, and the developer inadvertently fails to account for one or more of the common flaws. Such flaws can be prevented by requiring developers to use a single, vetted set of authentication and session management controls.

4. Insecure Direct Object References – A developer might use the key names of database fields in forms on their web application. The attacker changes the name of the form field with a different value that they may not be authenticated for and submits the data. The prevention of such an attack can be made by simply adding abstraction to the fields of the database so that the field names are not known to the attacker.
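
A minimal sketch of that abstraction: the client only ever sees indirect keys, and the server maps them back to the real database ids (the map and names here are hypothetical):

// Server-side map created per user or session: the browser only sees 'doc1', 'doc2', ...
const referenceMap = {
  doc1: 4711,   // real database id, never sent to the client
  doc2: 4712
};

function getDocumentId(indirectRef) {
  const realId = referenceMap[indirectRef];
  if (realId === undefined) {
    throw new Error('Unknown reference'); // attacker-supplied value: reject it
  }
  return realId;
}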

5. Security Misconfiguration – The entire “stack”—the technologies that, when stacked up, facilitate an application to function—needs to be secure or else other attack vectors can be exploited. For example, a couple of years ago it was found that MySQL databases would allow an attacker to log in to a database after 255 tries of the password. Any affected database that was exposed to the internet i.e. not filtered with a firewall, could be directly accessed without the need to go through the web application in order to be exploited. Prevention of such attacks can be made by ensuring that both developers and IT staff collaborate and continuously audit their systems for such flaws.

6. Missing Function Level Access Control – A malicious and authenticated user changes a parameter of the URL and is able to gain access to a function of the website because the developer didn’t check to make sure the user should have access to that function. The prevention of such an attack is as simple as making sure that if a function is accessed, the user has the right to access it.

7. Cross-Site Request Forgery (CSRF) – Allowing data to be sent to a server without also including an unpredictable token value that the server already knows would allow such an attack. A user could go from one website to another, malicious, one where the malicious one would send seemingly authenticated data without the knowledge of the user. Prevention of such an attack is easy. Just set an unpredictable token in the session and have the form submit it as another field. Clear the token on form submission so each new submission has a unique value.
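
A minimal Express sketch of that token flow, assuming express-session for session storage and a hypothetical /transfer form (dedicated CSRF middleware does the same job in production):

const express = require('express');
const session = require('express-session');
const crypto = require('crypto');

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: true }));

// Render a form carrying an unpredictable token that is also stored in the session.
app.get('/transfer', (req, res) => {
  const token = crypto.randomBytes(32).toString('hex');
  req.session.csrfToken = token;
  res.send(
    '<form method="POST" action="/transfer">' +
    '<input type="hidden" name="csrf" value="' + token + '">' +
    '<button>Send</button></form>'
  );
});

// Reject the submission unless the tokens match, then clear the token
// so each new submission gets a fresh value.
app.post('/transfer', (req, res) => {
  const expected = req.session.csrfToken;
  delete req.session.csrfToken;
  if (!expected || req.body.csrf !== expected) {
    return res.status(403).send('Invalid CSRF token');
  }
  res.send('Transfer accepted');
});

app.listen(3000);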

8. Sensitive Data Exposure – One of the more difficult exploits, it typically occurs when the developer uses no data encryption, or weak data encryption, or weak keys and/or weak encryption and hashing algorithms. As technology advances, hashing and encryption algorithms should be used and kept updated to the latest versions.

9. Unvalidated Redirects and Forwards – An attacker can trick a user into clicking on a link that contains an unvalidated parameter, which allows a redirect to a page with higher administrative privileges. Prevention can be not using redirects/forwards; if redirects/forwards are used, then validate the redirect parameters and ensure the user has the appropriate privileges to access the redirected/forwarded page.
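
A minimal Express sketch of validating the redirect parameter against a whitelist (the route and allowed paths are hypothetical):

const express = require('express');
const app = express();

const ALLOWED_REDIRECTS = ['/dashboard', '/profile', '/orders'];

app.get('/after-login', (req, res) => {
  const target = req.query.next;
  // Only redirect to destinations we explicitly allow; otherwise fall back.
  if (ALLOWED_REDIRECTS.includes(target)) {
    return res.redirect(target);
  }
  res.redirect('/dashboard');
});

app.listen(3000);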

10. Using Components with Known Vulnerabilities – Developers may not check to see if components, such as plugins, have known vulnerabilities before using them in a web application. As vulnerabilities are commonly widely published as a means to inform developers, attackers can use the same published data to create attack vectors for web applications. Prevention includes checking public information to see if any components of the web application are vulnerable; creating security tests for the components that scan for common vulnerabilities; and adding an abstraction layer to the component to restrict what data can be input and output by it.

What’s next for Developers?

Whether you're new to web application security or already acquainted with these risks, the task of delivering a secure web application or fixing an existing one can be difficult. If you have to manage a large application portfolio, this task can be overwhelming. To help organizations and developers reduce their application security risks in a cost-effective manner, OWASP has created numerous free and open resources that you can use to address application security in your organization.

OWASP recommends that organizations establish an application security program to gain insight into and improve security across their applications and APIs. Achieving application security requires many different parts of an organization to work together efficiently, including security and audit, software development, business, and executive management. Security should be visible and measurable, so that all the various players can see and understand the organization's application security posture.

Friday, June 14, 2019

WordPress - When does it run out of legs?


When a developer needs to get a site up and running quickly, WordPress is one of the most popular development platforms. The flexibility it allows through hooks and filters, and its relatively short learning curve compared with similar platforms like Drupal, make WordPress the multi-tool of site platforms, especially for developers just starting out. But when is enough, enough; when does WordPress run out of legs?

One of the biggest issues with WordPress is that once you activate more than around 25 plugins, site performance starts to suffer. As your new professional server admin (~$100,000/year) decides to add a load balancer ($20/month; Linode), increase the number of back-end PHP server nodes (somewhere between $5 and $80/server/month; Linode), and move the database onto its own server (somewhere between $60 and $960/server/month; Linode), the true cost of using the open-source WordPress platform as a multi-tool comes into view.

Performance: isn't there a plugin for that? There are, in fact, various plugins that give the appearance of solving most performance problems, yet mitigating performance issues at the application layer, when those issues are caused by other plugins at the same application layer, simply adds more performance problems than we started with. If your server ran out of memory because it's serving too many simultaneous requests, then adding another plugin at the application layer that requires the same server resources to function won't help.

Yes, more plugins mean more code that a browser has to load, which can slow it down. And, to paint in general terms, a site with a simpler codebase will usually load faster than one with a complicated one, all other things being equal.

There are four major areas we should look at when evaluating whether to add a plugin to the website:
  • The plugins that load a lot of scripts, styles, or other assets on pages.
  • The plugins that add extra database queries to each page.
  • The plugins that need to perform complex operations.
  • The plugins that perform a lot of remote requests.

When we depend on the plugin and platform developers for changes, we have a few options for tackling performance issues at the application layer:

1. We can take over maintenance of the plugin ourselves, thereby giving up easy plugin updates, since our version of the plugin will no longer match the original author’s (requires a dedicated developer, ~$80,000/year, and possibly a designer, ~$52,000/year);

2. We can lobby the plugin author, or even try to contribute to the project, to get performance-focused changes made (requires time ($$$) and a developer, ~$80,000/year);

3. We can invest time searching for a similar plugin that perhaps has better performance characteristics (requires time ($) with no guarantee of a better outcome).

Time is money, and one of the main reasons we pick WordPress as a development platform is that we buy into the illusion that we're getting something for next to nothing. Experienced developers and designers can set up a WordPress site in less than 15 minutes and probably have the full application with theme finished in less than 10 days, without considering that we will end up paying for this decision later on, and not just from a performance standpoint. When a plugin author decides to make changes that break other plugins depending on one major plugin (read: the famous WooCommerce), we may pay dearly through lost income and a bruised brand from a broken website.

WordPress isn't all bad, especially if you find a development firm that truly understands how to add functionality without sacrificing performance and can serve a WordPress application efficiently. Still, it will save us a lot of time and headache if we step back and consider whether another platform might be a better fit for our requirements, rather than simply accepting WordPress by default as the solution to every problem.

Wednesday, May 29, 2019

Top Trending Technologies Expected to Acquire a Huge Market in 2019

Change is the only constant. This applies to your professional life as well. Upskilling yourself is a necessity these days, and the reason is simple: technology is evolving very rapidly. Here are some of the trending technologies expected to acquire a huge market in 2019.

1) Artificial Intelligence (AI):

Artificial intelligence (AI) makes it possible for machines to learn from experience, adjust to new inputs and perform human-like tasks. Most AI examples you hear about today – from chess-playing computers to self-driving cars – rely heavily on deep learning and natural language processing. Using these technologies, computers can be trained to accomplish specific tasks by processing large amounts of data and recognizing patterns in the data.

Why is Artificial Intelligence Important?
  • AI automates repetitive learning and discovery through data.
  • AI adds intelligence to existing products.
  • AI adapts through progressive learning algorithms.
  • AI analyzes more and deeper data.
  • AI achieves incredible accuracy.
  • AI gets the most out of data.



2) Blockchain:
  • This is the technology that powers Bitcoin, the completely different parallel currency that has taken over the world.
  • Interestingly, blockchain as a technology has sweeping potential in everything from healthcare to finance to real estate to law enforcement.



3) Cloud Computing:

Cloud computing is the delivery of computing services—including servers, storage, databases, networking, software, analytics, and intelligence—over the Internet ("the cloud") to offer faster innovation, flexible resources, and economies of scale.

Uses of Cloud computing - 
  • Create new apps and services
  • Store, back up, and recover data
  • Stream audio and video
  • Deliver software on demand
  • Test and build applications
  • Embed intelligence


4) Angular and React:

  • Angular and React are JavaScript-based frameworks for developing modern web applications.
  • Using React or Angular, you can build a highly modular web application, so you don't have to make sweeping changes to your code base to add a new feature.
  • Angular and React also let you build native mobile applications with the same JS, CSS and HTML knowledge.
  • Best of all, both are open-source libraries with highly active community support.


5) Big Data:

Big data refers to the problems associated with processing and storing different types of data. Most organizations today depend on big data analytics to gain deep insight into their:
  • customers,
  • product research,
  • marketing initiatives, and more.

Hadoop and Spark are the two most popular frameworks for solving Big Data problems.


So, we have seen that technology is changing rapidly, and Arna Softech has a passion for technology. We believe that following new trends and innovations in the software world is an absolute necessity.


Wednesday, May 22, 2019

Content Marketers Should Know More About Search




In contrast to content marketing, SEO is a very young discipline. It has been around for only a couple of decades, yet it quickly became the primary digital focus for many marketers.
However, SEO can't exist without content. Even with SEO evolving so rapidly, there are still numerous misconceptions and misunderstandings around it. Those misconceptions can affect the content marketing process in a not-so-positive way.

Let’s clear things up a bit:

1. SEO Has Become Much More Integrated and Different

Many SEOs used to ignore very important digital marketing perspectives, including user experience, brand building and so on. The main goal was simply to get a page ranked in search engines.
Nowadays it's finally different: SEO is only one component of success. It's next to impossible to achieve high rankings without building authority and brand awareness, or without ensuring users have a good experience on the website. Google takes most of that into account: they monitor how users interact with a site, how satisfied they are, and how quickly they find answers when landing on a page from search results.
Most companies are offering full-package internet marketing services that include video production, social media marketing and usability. Some companies even go beyond that by offering “integrated marketing services”.

2. SEO Is No Longer Focused on Exact-Match Strings

Remember the days when writers were given one phrase and forced to use it a specific number of times in the content?
Well, those days are happily over.

Search engines have moved past so-called "keyword strings". They can now understand concepts, entities and topics. After its algorithm updates, Google now understands all kinds of phrases that can satisfy the user's initial query and focuses on the quality of the results rather than matching exact strings. Quality and depth of content have become considerably more important than the exact keyword you put on the page. When you start working on content, make sure you understand the related subtopics and subcategories that need to be included in your site or article.

3. Search Gives Us Lots of Cues

Search has evolved. Google has become smarter at identifying search intent and giving users precisely what they need. They have become better at recognizing people's struggles and serving the best answer within search results. They have figured out how to discover the questions behind questions and show users more options for researching a topic.
The fact that all of that shows up in search results makes it possible for writers to learn more about any topic they are writing about. The key is to learn how to spot and interpret those cues to create more valuable and better-optimized content.
Let’s see how it gives cues:

1) When searching, look at all kinds of search results that come up.
  • Is there a video carousel? If video results show up, it means Google has found that users engage more with videos, so maybe you should add a video for that particular keyword.
  • Are there image results? It means Google has seen its users look for visual content when searching.
  • Are there shopping results? This signals high commercial intent, so your article may not do well here.

2) When searching, check out “People Also Ask” results.
Google's "People Also Ask" boxes show prominent inquiries based on your query. These give a goldmine of content motivation. Click on some of those questions to see more questions.

3) When searching, pay attention to Google’s Featured Snippets.
Google has come a long way in learning how to parse any web copy and extract the useful information. To make your content snippet-friendly:
  • Define concepts.
  • Focus on facts and numbers (e.g. if you are describing a tool, explain its pricing).
  • Use subheadings (especially if you are using questions from the step above).

Nowadays, rather than forcing artificial copy, Google pushes you to improve your content by researching more, structuring it better and using a more varied vocabulary.


Wednesday, April 3, 2019

How to Build Apps Faster using Redis Cache in Node.js


The core strength of Node.js is its speed in executing async tasks, but there are ways to improve response times further by using a Redis cache. In this post I will go over the concept of caching and an approach to implementing it.

Redis is an open source (BSD licensed), in-memory data structure store, used as a database, cache and message broker. It supports strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes with radius queries and streams. Redis has built-in replication, Lua scripting, LRU eviction, transactions and different levels of on-disk persistence, and provides high availability via Redis Sentinel and automatic partitioning with Redis Cluster.

What is Caching?

Caching is the process of storing data in a temporary data store where data is kept for later use. This temporary data store is called Cache.

A cache as a data store is easier for the client (or server) to reach, as opposed to a permanent data store that might be located on a different service, which takes more time and resources to reach (a database or an external API endpoint).

The first step in application optimization is implementing caching. Caching involves storing data in a high-performance store (often temporarily) so that the data can be retrieved faster at a later time. Redis is an efficient key-value data store that has become a preferred choice for caching. Redis holds an edge over other options because of the number of data structures it supports — such as strings, hashes, lists, sets, and so on. This provides a lot of flexibility!

Plumbing Node.JS and Redis

Let’s first install Redis to understand the impact of using it as an in-memory data store for caching on a Node server. On installation, the Redis server is started and runs on the default port 6379.

Here, we will fetch the country codes, which seldom change, from the existing application and cache them on the Node server. To use Redis for caching, we need the redis module (a Redis client) in the Node.js application, which handles the connection to the Redis server.

The Redis client would help in storing and obtaining the data from the Redis cache.

When a data request is made to the Node.js app, the products API first checks for the data in Redis before going to the origin to obtain it. If the data is available in Redis, it is fetched from memory and sent as the response to the request. Otherwise, the products API goes to “some_other_domain” to fetch the data, caches the response in Redis, and then sends it as the response to the request.
This way the response time is drastically reduced.
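
As a rough sketch of that check-the-cache-first flow, using the same callback-style redis client as the larger example later in this post (the /country-codes route and the legacy URL are hypothetical placeholders):

'use strict';
const express = require('express');
const axios = require('axios');
const redis = require('redis');

const client = redis.createClient(); // connects to localhost:6379 by default
const app = express();

// Hypothetical helper: fetch country codes from the legacy application.
const fetchCountryCodes = () =>
  axios.get('https://some_other_domain/api/country-codes').then(r => r.data);

app.get('/country-codes', (req, res) => {
  // 1. Check Redis first.
  client.get('countryCodes', (err, cached) => {
    if (cached) {
      return res.send(JSON.parse(cached)); // cache hit: serve from memory
    }
    // 2. Cache miss: fetch from the legacy app, cache it, then respond.
    fetchCountryCodes()
      .then(codes => {
        client.setex('countryCodes', 86400, JSON.stringify(codes)); // 24h TTL
        res.send(codes);
      })
      .catch(() => res.status(502).send('Could not fetch country codes'));
  });
});

app.listen(3000);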


In our Node.js app, fetching 1000 records from the legacy application took 8 seconds.

When the requested data was cached in Redis and requested again, the response time dropped to 50 milliseconds.

Furthermore, if database result sets are cached in Redis, API performance also increases. Cached entries can be cleared automatically by setting an expiration time with the ‘setex’ method.

Epilogue

By leveraging the power of Redis, Node.js applications can deliver a major breakthrough for enterprises. Redis's strong caching layer makes apps respond quickly.


The example above illustrates the impact of Redis on a Node.js app and speaks for itself.

Note, however, that Redis stores data in memory, so the amount of data it can hold is limited by the available RAM. It is most cost-effective when you cache a modest amount of frequently read data; trying to keep very large data sets entirely in memory quickly becomes quite expensive.

Redis is a powerful tool, but it must be used with your Node.js app strategically, depending on the type of data and the access patterns you need.

Example:
We will compare two pieces of Node.js code. The first retrieves data from the Google Books API without putting Redis in front of the endpoint.

Node.js without Redis:

'use strict';

// Define all dependencies needed
const express = require('express');
const responseTime = require('response-time');
const axios = require('axios');

// Load Express framework
var app = express();

// Create a middleware that adds an X-Response-Time header to responses.
app.use(responseTime());

const getBook = (req, res) => {
  let isbn = req.query.isbn;
  let url = `https://www.googleapis.com/books/v1/volumes?q=isbn:${isbn}`;
  axios.get(url)
    .then(response => {
      let book = response.data.items;
      res.send(book);
    })
    .catch(err => {
      res.send('The book you are looking for is not found !!!');
    });
};

app.get('/book', getBook);

app.listen(3000, function() {
  console.log('Your node is running on port 3000 !!!');
});

Node.js with Redis:

'use strict';

// Define all dependencies needed
const express = require('express');
const responseTime = require('response-time');
const axios = require('axios');
const redis = require('redis');
const client = redis.createClient();

// Load Express framework
var app = express();

// Create a middleware that adds an X-Response-Time header to responses.
app.use(responseTime());

const getBook = (req, res) => {
  let isbn = req.query.isbn;
  let url = `https://www.googleapis.com/books/v1/volumes?q=isbn:${isbn}`;
  return axios.get(url)
    .then(response => {
      let book = response.data.items;
      // Cache the response under the key "isbn".
      // Set cache expiration to 1 hour (3600 seconds).
      client.setex(isbn, 3600, JSON.stringify(book));
      res.send(book);
    })
    .catch(err => {
      res.send('The book you are looking for is not found !!!');
    });
};

const getCache = (req, res) => {
  let isbn = req.query.isbn;
  // Check for cached data on the Redis server first
  client.get(isbn, (err, result) => {
    if (result) {
      res.send(result);
    } else {
      getBook(req, res);
    }
  });
};

app.get('/book', getCache);

app.listen(3000, function() {
  console.log('Your node is running on port 3000 !!!');
});


As you can see in the code above, Redis stores the cached data under the unique key value we specified, using this call:

client.setex(isbn, 3600, JSON.stringify(book));

And retrieves the cached data using the call below:

client.get(isbn, (err, result) => {
  if (result) {
    res.send(result);
  } else {
    getBook(req, res);
  }
});

This is the result of testing both versions. Without Redis as a cache, the request takes at least 908.545 ms.



Very different when Node.js uses Redis: it is very fast, taking only 0.621 ms to retrieve data from the same endpoint.



Conclusion:

I know there are still many ways to improve and speed up performance in Node.js, and the one above is not enough on its own to make your Node.js app run perfectly. Please let me know other approaches in the comments.