
Why Administration? March 22, 2018

Author: Akeem Spencer

Automation is essential to health care in the 21st century. Physicians and nurses choose these professions primarily because they wish to care for suffering humanity and cure disease. Currently, these professionals are burdened by administrative paperwork that keeps them away from their patients. Stakeholders, physicians, and nurses are desperately looking for solutions. Could automation be the answer? <br/> Let's do a Google search with the keywords "AI", "Machine Learning", and "Physician Ward" and collect the top results crammed at the top of the search engine. ![search bar on google][1] ![Articles Of Results][2] <br/> I've read the top five articles on the results feed, and the central themes across all of them are the following: many deep-learning training sets suffer from a causation problem, one closely related to the so-called "black-box" problem in artificial intelligence. Deep learning is increasingly being adopted as a solution to the problems businesses face. But can LSTM machines supplement, augment, or even replace the hired administrative staff within a typical clinical environment? Another key trend from article to article is collaboration between market sectors. Governments, corporations, and physician wards are building diagnostic tools to help detect onerous diseases and cancerous cells residing in patients. But if we are using AI as the storefront for analysis of medical images, can we use AI to break the mold on scheduling, admission, and customer service for patients across the United States? This is an open forum. Feel free to discuss and share your opinions below. <br> [1]: https://spencertechconsulting.com/media/search-bar.JPG [2]: https://spencertechconsulting.com/media/Link-Descriptions.JPG



Redis for SSO legacy systems June 28, 2017

Author: Akeem Spencer

Greetings to all of you reading this in the summer of '17! <br> <br> I haven't posted anything in the month of June, but I've been very proactive within my company, landing some contracts to keep the lights on. This month I've immersed myself in the healthcare arena by learning a little more about what artificial intelligence is doing for the field. During the development phase for this project, a project I cannot publicly acknowledge or disclose, I've had the opportunity to use Redis as an in-memory cache database on my Raspberry Pi. <br> <br> Setting up a system development life cycle for this project is not an easy task, considering I had to recreate and mirror the local hospital architecture here in NY. With limited resources, the project screens users with a login authentication module before administrators can log in to see their own dashboard. The model replication of the hospital architecture piggybacks on the SMART on FHIR protocol for the patient and admin object fabrication. The problem is, the syntax and object notation descriptions are highly verbose and fluffed up, so I'm storing the model object data in JSON notation. I didn't want the project's sandbox database to be muddled and hard to interpret, so I decided to use an open-source PostgreSQL database server, because that's what I'm comfortable with. The PG instance also lives on the Raspberry Pi's disk, and my virtual cloud on DigitalOcean serves as the server instance for the software application I'm scaling. <br> <br> For the window display, I went the Tcl/Tkinter route to create the object for the login screen, and as I was writing the code it dawned on me how fast the application should run, from sending pertinent patient information to retrieving it in an encrypted manner. I wanted to see just how much time I'd shave off the login phase if I used an in-memory cache in front of Postgres.
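As a sketch of what "storing the model object data in JSON notation" can look like, here's a hypothetical, trimmed-down FHIR-style Patient resource reduced to just the fields a login/dashboard flow needs. The field names follow the FHIR Patient shape, but the values and the flattening are illustrative, not the actual SMART on FHIR schema or the code from this project:

```python
import json

# Hypothetical, trimmed-down FHIR-style Patient resource; real SMART on FHIR
# objects carry far more metadata than a sandbox project needs.
raw_resource = {
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1987-04-12",
    "telecom": [{"system": "email", "value": "jane@example.com"}],
}

def flatten_patient(resource):
    """Reduce a verbose Patient resource to the flat blob we actually store."""
    name = resource["name"][0]
    return {
        "id": resource["id"],
        "full_name": " ".join(name["given"] + [name["family"]]),
        "birth_date": resource["birthDate"],
        # Pull the first email contact out of the telecom list, if any.
        "email": next((t["value"] for t in resource.get("telecom", [])
                       if t["system"] == "email"), None),
    }

# This flat JSON string is what would land in a Postgres json/jsonb column.
blob = json.dumps(flatten_patient(raw_resource))
```

Keeping only the flattened blob is what stops the sandbox database from getting muddled with the full, fluffed-up resource.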
I remember spending odd nights in NYC going to meetup sessions where people were using RabbitMQ, MongoDB, and Redis. Redis is a rather new platform and interface, and I already have a Redis server running on the very app serving these web pages to you right now! So I went ahead, downloaded the tar file, and made the configurations in my virtual session. <br> <br> At the concept development phase, the overhead and customization for the Redis server were pretty intuitive, and I stored the binary executables for the Sentinel, server, and client within the local virtual environment of my Debian OS. The Redis shell's footprint is very lean and doesn't leave a heap of clutter on my hard drive. <br> <br> The Redis server is simple to operate, with a quick boot from the local bin folder by running `./redis-server`. Disregarding the warning signs, my server instance is prepped and ready to go! <br> <br> <br> ![REDIS-START-IMAGE][1] <br> <br> <br> As I was setting up the login screen, I wanted to test the speed and the lines of code it took to enter the user's dashboard. With the Postgres test, the following results were recorded using a clock from start to finish: <br> <br> <br> ![pg][2] <br> <br> <br> Versus the time it takes Redis to index the table and retrieve the user's information: <br> <br> <br> <br> ![redis][3] <br> <br> <br> And here's the code. Slightly different block segments for DB authentication, but it does the same thing: <br> <br> <br> ![CODE-FOR-DUMMIES-JUST-KIDDING-UZ-SMART][4] <br> <br> <br> Now, of course, these values weren't run through an ANOVA to establish confidence intervals, but from the samples collected, a 50% decrease in the time to authenticate a user seems worthwhile, especially in a hospital environment where user communication is paramount.
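The code itself lives in the screenshot above, but the pattern it describes, checking a fast in-memory store before falling back to Postgres, can be sketched like this. The callables here are hypothetical stand-ins for psycopg2 and redis-py calls, so the same logic can be exercised without a live database:

```python
import hashlib

def make_cached_auth(db_lookup, cache_get, cache_set, ttl=300):
    """Cache-aside login check: try the in-memory store first, fall back
    to the relational database, and warm the cache on a miss.

    db_lookup(username) -> (salt, pbkdf2_hex) or None   # slow Postgres path
    cache_get(key)      -> cached credentials or None   # fast Redis path
    cache_set(key, value, ttl)                          # warm the cache
    """
    def authenticate(username, password):
        key = "auth:" + username
        creds = cache_get(key)
        if creds is None:
            creds = db_lookup(username)           # cache miss: hit the DB
            if creds is not None:
                cache_set(key, creds, ttl)        # next login skips the DB
        if creds is None:
            return False                          # unknown user
        salt, stored_hex = creds
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return digest.hex() == stored_hex
    return authenticate
```

Because the three backends are injected, production can pass `redis-py` and `psycopg2` wrappers while tests pass plain dicts; the timing difference you measure is then purely the cache hit versus the round trip to Postgres.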
<br> <br> The invocation for the next phase is to emulate the desktop window of the 'Patient Perfect' EMR system and gain a grant to finish the project with the skills I've accumulated being self-taught. <br> <br> This week marks a year into my freelance venture, and it's been on the upswing since last year. If you have a software or hardware idea you want to bring to fruition, hit us up! We're looking for interns to work with as well. No idea is foolproof, but no idea should fall on deaf ears. Voice your opinions and translate them into products. We're ready for you! <br> <br> [1]: https://spencertechconsulting.com/media/redis-server-startup.JPG [2]: https://spencertechconsulting.com/media/pg_login_time_encapsulation.png [3]: https://spencertechconsulting.com/media/redis_login_time_encapsulation.png [4]: https://spencertechconsulting.com/media/redis_vs_pg.JPG



Key Takeaways From The Design 2 Parts Expo May 2, 2017

Author: Akeem Spencer

Hey guys, I wanted to give you my thoughts and fan favorites from the Design 2 Parts expo held outside Philadelphia in Oaks, PA. While I myself may not be involved in a production/facility environment, my business is beginning to flourish a tad, and I wanted to see the 3D-printer inventors & hackers who are mastering the art of independent plastic-extrusion parts for hobbyists like myself. Because of my prior experience as a manufacturing engineer, I had to stop at some of the other professional experts' booths showcasing OEM products. These products and services included: - CAD/CAM Software - CAD Design Services - Gaskets, Springs, Retaining Rings, & Wave Springs - Plastic Parts & Components - Rubber Molding - Fabricators The expo was a two-day event, so I decided to stick with Thursday afternoon's session. I had my engineering badge and a highlighted list of booths I had to visit, making 'disruption' in the Mid-Atlantic tech sphere. The rows and booth numbers were out of order, but I was able to target the companies I had circled in my directory listing. ## Plastics & Components ---------- <br> <br> I visited Fisher Unitech's platform and was blown away by the number of complex PLA, ABS, ASA, & nylon materials they can composite with the floor-model 3D printer. The demo unit (I believe, correct me if I'm wrong) was the Stratasys F170 fused deposition modeling 3D printer. One of the reps, Lisa Hannon, showed me the array of parts they can smoothly mold from their new deposition machine. The more complex parts had me stunned, and I'm sorry I don't have more pictures of the pieces that were fabricated (it may be due to trademark and proprietary materials on the floor, but oh well, it's 2017, people have camera phones these days). The products ranged from stiff plastic pieces to flexible elastomers on the keyring.
The bulky but rapid-turnover machine can produce parts from gas pedals to rotating A/C ventilation fans that are actually streamlined into production for the fuselage of Boeing 777 aircraft. Fisher Unitech, a Stratasys partner, is trailblazing as it looks to become an OEM seller to Siemens and Ford Motors. ## Springs & Things ---------- <br> <br> For the complexity and challenge of finding a valuable vendor for fasteners, Smalley is doing their thing. I see the company heavily advertised in Machine Design & NASA print publications, so I had to stop by the booth. They handed the buyers and engineers a bag of goodies, and here are some of the pieces below: ![5pics][1] <br> <br> Above you can see some snapshots of the different varieties of coils that Smalley manufactures. For instance, some drive shafts for linear instruments require a retaining ring to keep the bearings centered and prevent them from shifting under torque. Below is a video showing some ways to insert bearings into the hollow area of the shaft: <br> <br> <iframe width="560" height="315" src="https://www.youtube.com/embed/3h0IPS_zkhQ" frameborder="0" allowfullscreen></iframe> The company is highly standardized when it comes to the various spring locks they fabricate, and I urge anyone in a manufacturing environment to check out [Smalley's][2] Spirolox&#xae; retaining ring, which gives conformance to any designer working in critical situations with difficult enclosures for components huddled together in tight spaces. <br> <br> ## Adhesives ---------- <br> <br> Paratronix is like a breath of fresh air for parts with smooth finishes and highly classified silk-like coatings developed for engineers. Paratronix develops adhesives and has been in the coating-services business for 35-plus years. The pieces were odd yet practical, with some of the materials extremely smooth and colorful.
If you have a hard time picturing what Parylene N material is applied to, visit this [website][3]. These benzene-ring (&#x232c;) polymer classes are used across functional organizations, from the military to the medical and electronics industries. The chlorine-substituted branch provides the electrical and physical properties and produces a low gas-permeability constant. The corrosion index is also low due to its unique structure. Paratronix also sells the Paratronix V494 & H1092 model Parylene Conformal Coating Systems. These systems can completely coat electronic components such as ferrites, magnets, RFID tags, O-rings, & ceramics. Apologies for the lack of visuals, but check them out online for quote estimates for your design. ## **The Favorite Supplier Is......** ---------- <br> <br> This company represents the prime opportunity of exposure, material development, brick & mortar hustling, & independence. Ultimaker-sponsored "Printed-Solid.com" was giving a demonstration of plastic wheel fabrication on the Ultimaker 3 printer. The cost and maintenance are flexible for designers & AutoCAD drafters, so you don't have to bend the budget. This 25-pound machine is highly favored from the UK all the way to Bangkok, Thailand. Here's a video underneath of the technology on display at Imperial College London. A few of the artists use the additive-manufacturing printers for their own compositions and prototype work to exhibit to the judges. Try and see if you can spot the printers on some of the workbenches below: <br> <br> <iframe width="560" height="315" src="https://www.youtube.com/embed/fB53-pcwPNQ" frameborder="0" allowfullscreen></iframe> The slicing software used to produce the parts has a very, very low learning curve, and most of the documentation is free and open-sourced on GitHub.
Check out the repo called [Cura][4], where issues are actively discussed and members can easily send commit changes without the extensive hassle of signing release forms for merges to occur (I think it's under the Apache 2 license, so you're free to hack away). ## **Epilogue & Observations** I loved every minute of seeing the rising talents in the STEM field. The only issues I had at the forum were the frequent barrage of sales personnel and the informal "QR" scan to exchange contact information between reps. Also, there is a clear lack of diversity within these industries, & I know my social-media atmosphere is made up of brilliant, young entrepreneurs. So if you're unemployed or seeking an opportunity to utilize your B.S., come join us and use your technical acumen to make a mark on this growing and ever-changing industry. With the New York tax rate dropping to historic lows, more employers are realizing how vital it is to assemble and build close to the East Coast mecca. You also have to take into account new federal initiatives being placed before Congress, like the Trump-proposed "By Americans for Americans". It's a great opportunity to knock the rust off your calipers and get back to manufacturing and patenting some work you've contrived. I'll definitely see myself going next year. Thank you for reading my post. Akeem Spencer [1]: /media/spring_collage.JPG [2]: http://www.smalley.com/retaining-rings/spirolox [3]: http://www.paryleneengineering.com/why_use_parylene.htm [4]: https://github.com/Ultimaker/Cura



Have Your DB and Web App In Separate Droplets April 5, 2017

Author: Akeem Spencer

Last week, with the onslaught of sshd-ddos and sshd attacks coming from China, I decided it's best to transfer my PostgreSQL server to a completely separate host region. When logging into my user account, I can open my mailbox and never fail to see 100+ replica messages looking like this: <br> <br> <br> ![China Attackers][1] <br> <br> <br> Inside the message, these provinces are "attacking" me: <br> <br> <br> ![regional attack message][3] <br> <br> So with this barrage of attacks, I've decided to move my PostgreSQL server to another droplet hosted by DigitalOcean. Before you start tampering with the droplet, here's how you can set up another database in a private VPS with the following series of steps. # Step 1: Choose Your Image: <br> I didn't know exactly what remote OS my website should run behind, but I initially picked the $20-per-month plan. Review the plan that's suitable for your corporation, company, etc. <br> <br> ![digitalocean plans][4] <br> <br> Once you've selected a plan that's feasible for you as a developer, check the virtual private networking, snapshots (a.k.a. the diff tool), and IPv6 options for additional extensions to use later during the configuration of the droplet. I'd suggest renaming it for the DB you currently run your website in conjunction with (redis@host, mysql@host, etc.). Don't forget to check the backup feature for the droplet; it will help considerably if things go wrong in the future! # Step 2: Install & Configure The Host: If you want me to send you an email tutorial for your DB of choice, hit me up at aspencerpsu@gmail.com. Otherwise, this is going to reference PostgreSQL. If you have a high-speed internet connection, I suggest configuring the droplet in DigitalOcean's built-in terminal.
If your web speed is spotty, then ssh into the terminal. Keeping ssh RSA keys saved on your personal computer is slightly inadvisable if you truly want a secure setup, so prefer the console where possible; if you did ssh in, use this command AFTER the configuration goes smoothly: > `ssh-keygen -R (your host key)` Install the following packages as root:

    root@myhost:~$ sudo apt-get install python-dev python3-pip \
        python3-dev libpq-dev postgresql postgresql-contrib

The packages above install the developer tools for PG (PostgreSQL) and the cluster commands, which make transferring database tables easier and more methodical. Next, open the `pg_hba.conf` file and change the following highlighted line: <br> <br> ![pg_host_based_file][5] <br> <br> Your original host environment can now connect to your remote server with the configuration displayed above. For extra security, you can use your IPv6 address instead. You'll also have to change the main `postgresql.conf` file in the same server directory: ![postgresql_configuration_file][6] I've included the file-location parameters for two specific reasons: - It's the default location - Change it for obvious reasons Change the file location and have it buried two directory levels down from the original location to keep it obscured from your staff. Remember, you're dealing with personal data that no one, superusers included, and not even you yourself, should have access to. Change the `listen_addresses` setting to the private IP address you created with the droplet generation. # Step 3: Change Django's settings file: Change the DATABASES parameter as shown below: <br> <br> ![django_conf_file][7] <br> <br> Use the same private IP address you set in `listen_addresses`. Send a SIGHUP to the PostgreSQL server and to the forward proxy engine your website renders requests through, and voila!
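Since screenshots of a settings file can be hard to read, here is a minimal sketch of what the relevant `DATABASES` block in Django's `settings.py` can look like. The database name, role, and the 10.x private address are placeholder assumptions, not the values from this setup:

```python
import os

# Placeholder values: "blogdb", "django_app", and 10.132.0.5 stand in for
# your own database name, role, and the DB droplet's private IP.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "blogdb",
        "USER": "django_app",
        # Keep the password out of version control; load it from the env.
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        # Private IP of the database droplet -- the same address you put in
        # listen_addresses on the Postgres side.
        "HOST": "10.132.0.5",
        "PORT": "5432",
    }
}
```

Pointing `HOST` at the private VPN address rather than the public one is what keeps the database traffic off the open internet.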
You should be in the clear with a more secure and comfortable system where everything works in conjunction. <br> <br> # Overview # ---------- This post is targeted at the at-home bloggers and web devs doing everything themselves. If your infrastructure begins to flourish and grow exponentially, it's highly desirable to hand your configuration management to a third-party tool like Ansible, Chef, or Puppet. Keep looking out for more posts from me, and if you're a dev like me, always show others how to protect themselves online and share these tips with them. If you have any problems, send me an email at aspencerpsu@gmail.com Thank You, Akeem Leighton Foster Spencer [1]: https://spencertechconsulting.com/media/failure_messages.JPG [2]: https://spencertechconsulting.com/media/failure_messages.JPG [3]: https://spencertechconsulting.com/media/chinese_hackers.JPG [4]: https://www.digitalocean.com/assets/media/homepage/create-5fe2f870.gif [5]: https://spencertechconsulting.com/media/pg_hba_conf.JPG [6]: https://spencertechconsulting.com/media/postgresmainconfig.JPG [7]: https://spencertechconsulting.com/media/django_conf_file.JPG



HTTP to HTTPS With LetsEncrypt (Using Django, Nginx, & Gunicorn) March 14, 2017

Author: Akeem Spencer

So you have a website you've just created from scratch, but it needs a secure connection. Furthermore, this website needs a mandatory CA lock to exchange ideas, messages, and currency transactions with customers or other businesses. This post will go through encrypting your site with Let's Encrypt, a certificate authority released to the public in December 2015. It covers configuring your Django & Nginx server while transitioning from HTTP to HTTPS on the root of your domain. Having a secure website not only embodies great security practice, it creates the ability to share your content through other mediums. For instance, this post's URL can be publicly shared and forked through twitter.com via its card reader. The card-reader bot authenticates the URL to determine if your site is eligible for sharing through the document's meta tags: ![card_being_read][1] ###### usage of twitter's card reader: [twitter's card reader][2] Or you'd like to secure PayPal business transactions. To ensure transactions go smoothly, follow this tutorial to have your site secured if you are running an Ubuntu 16.04 environment or another Linux/Debian platform. The following assumptions are taken into consideration: - Your domain is valid and registered through a provider like GoDaddy, NameCheap, HostGator, or another trusted registrar. - On my VPS, I'm using Nginx as a reverse proxy and Gunicorn as the worker server for the front end, followed by Django as my database-to-HTML rendering machine. This tutorial will go through getting your certs configured on Nginx instead of Apache2, which is slightly unconventional. You can follow [Khophi's][3] excellent blog walkthrough to get up to speed with a web server, of course, if you're comfortable with PostgreSQL and Python. - At least 35 MB of free memory for installing the letsencrypt package. Also, make sure the A records are pointing to the IP address or the IPv6 host address.
Otherwise, let's begin: # Step 1: Install letsencrypt: ---------- On Ubuntu, we first must update all packages so none are incompatible with the incoming package. The `letsencrypt` client can be acquired through the package manager by inputting the following commands:

    (venv)/home/user$ sudo apt-get update
    (venv)/home/user$ sudo apt-get install letsencrypt

If you feel like taking ownership and installing the package manually, you can download the source file from [here][4] and set up the package with the following command:

    (venv)/home/akeem$ python setup.py install --user

Make sure the user has `sudo` admin privileges before getting into trouble. # Step 2: Obtain an SSL certificate: ---------- I have my remote VPS on DigitalOcean, and their [tutorial][5] recommends using the webroot plugin to safely store the keys. `letsencrypt` has a standalone feature where we can input the user's credentials to apply for the certificate. Input the following command to continue creating the SSL CA files:

    (venv)/home/akeem$ sudo letsencrypt certonly -a standalone

Your screen should automatically prompt you to a screen where you can fill in your email address and the root's domain address like below: ![CERTIFICATION][6] ![domain names][7] But please make sure you're using the same hostname as the one linked to the A record and configured under the server's location root path. The process can be further automated by using the `letsencrypt-auto` command; read its manual for a better understanding of which flags to use for certification. Once the authentication is completed, there should be a successful STDOUT to the terminal like this:

    IMPORTANT NOTES:
    - If you lose your account credentials, you can recover through e-mails
      sent to sammy@digitalocean.com
    - Congratulations! Your certificate and chain have been saved at
      /etc/letsencrypt/live/yoursite.com/fullchain.pem. Your cert will
      expire on 2016-03-15. To obtain a new version of the certificate in
      the future, simply run Let's Encrypt again.
    - Your account credentials have been saved in your Let's Encrypt
      configuration directory at /etc/letsencrypt. You should make a secure
      backup of this folder now. This configuration directory will also
      contain certificates and private keys obtained by Let's Encrypt, so
      making regular backups of this folder is ideal.
    - If you like Let's Encrypt, please consider supporting our work by:
      Donating to ISRG / Let's Encrypt: https://letsencrypt.org/donate
      Donating to EFF: https://eff.org/donate-le

If everything went according to plan, within /etc/letsencrypt/live/`your_domain_name` you should have all the certs in that directory: - fullchain.pem - privkey.pem - chain.pem - cert.pem Now it's time to configure the Nginx site server. # Step 3: Configure the Nginx server ---------- Now that the certs are in place, we can transfer the HTTP GET requests to the proper port, which is 443. Make sure the port isn't used by any other application with the following command:

    (venv) /home/user$ fuser -v -n tcp 443

Hopefully, the stdout results in a null printout. We have to add listening parameters for the 443 port and use a 301 return in one server block for users who still reach the domain over a plain 'http:' root. My conf.
file looks like the following:

    server {
        client_max_body_size 4M;
        listen 80 default_server;
        listen [::]:80 default_server;
        server_name yourwebsite.com www.yourwebsite.com;
        return 301 https://$server_name$request_uri;
    }

    server {
        listen 443 ssl http2 default_server;
        listen [::]:443 ssl http2 default_server;
        ssl_certificate /etc/letsencrypt/live/yoursite/fullchain.pem;
        ssl_certificate_key /etc/letsencrypt/live/yoursite/privkey.pem;
        ssl_session_cache shared:SSL:50m;
        ssl_stapling on;
        ssl_stapling_verify on;
        ssl_session_timeout 1d;
        ssl_ciphers 'you-can-get-your-cipher-at-mozilla-ssl-configuration';
        ssl_protocols TLSv1.2;
        ssl_prefer_server_ciphers on;
        add_header Strict-Transport-Security max-age=15768000;

        # -----------REST-OF-CONFIG-UNDERNEATH--------
    }

There were some parameters, like `ssl_ciphers`, I couldn't hand out because they're unique to me; go to [mozilla's page][9] to generate ssl params and get a better gist of what to include in your website configuration, including if you're operating on a server other than Nginx. # Step 4 (Recommended): Firewall Permissions. ---------- We have to open permission for users to access the 443 port.
Enable ufw and add `Nginx Full` & `OpenSSH` to the walls with the following commands:

    (venv)/home/user$ sudo ufw allow 'Nginx Full'
    (venv)/home/user$ sudo ufw allow 'OpenSSH'
    (venv)/home/user$ sudo ufw enable
    (venv)/home/user$ sudo ufw delete allow 'Nginx HTTP'

If there are no errors, let's check whether our configuration file is good to go by running the following command:

    (venv) /home/user$ sudo nginx -t

You should receive output like this:

    nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
    nginx: configuration file /etc/nginx/nginx.conf test is successful

Restart the gunicorn worker and the nginx proxy:

    (venv) /home/user$ sudo systemctl restart nginx
    (venv) /home/user$ sudo systemctl restart gunicorn

If there are no errors, check your website and you should see this rendered (if you're using Google Chrome): ![secure_lock][10] If you don't, just write in the comments below and I'll try my best to give you the proper documentation. But we're not done: if you go to https://www.ssllabs.com/ssltest/analyze.html?d=`your_website.com`, your website may be stuck at a B+ rating. We have to configure extra parameters, specifically the dhparam, interpreted as the Diffie-Hellman parameter. Why you need to express this within your conf file, lord knows, but just do it to get a stronger rating. CD into your /etc/ssl/certs directory and create the key with the following command:

    (venv)/home/user$ sudo openssl dhparam -out dhparam.pem 4096

This takes about 5-8 minutes, so grab some coffee or something. After the key finishes generating, add the additional parameter within the `nginx.conf` file under the ssl params: > You may also want to generate a snippet file for all these ssl params and `include` it within the server block so the file isn't so cluttered, for good measure ... `ssl_dhparam /etc/ssl/certs/dhparam.pem;` ... Once you have the path properly located, check your rating in the browser by entering the URL https://www.ssllabs.com/ssltest/analyze.html?`d=yoursite.com`.
You should see an A+ grade certification. If you didn't get an A+ rating, you may have generated the dhparam key with a 2048-bit modulus; try upgrading it to 4096 bits using the previous step. # Step 5 (Optional): Renew the website: ---------- For brevity, I've refrained from adding the step to renew the certificate using a cron job. The SSL key will expire in 90 days, so it's extremely advisable to renew your keys within 60 days. I set a notice in my calendar to remind me to renew my cert file, but if you're forgetful and would like to automate the process, follow Jeff Bradberry's post on automating [Let's Encrypt reissuance][11]. Bear in mind his certs are generated using the webroot plugin, whereas this tutorial used the standalone option, so proceed depending on how your environment is structured. # What's Next: ---------- And now you're secured! Keep in mind, if you're uploading media files from an unknown source or using scripts that are unsanitized, you'll receive a notice indicating the URI is not secure. Make sure that if you have users uploading links or files from another unverified host, they upload HTTPS documents only. Thank you for reading my post!
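If you do want to wire up the automation yourself, a minimal sketch of a renewal cron entry might look like the following. The schedule, file path, and the nginx stop/start (needed because the standalone plugin binds port 80 during renewal) are assumptions to adapt to your environment, not a tested recipe:

```cron
# /etc/cron.d/letsencrypt-renew -- hypothetical schedule: 2:30am on the 1st
# of every other month, well inside the 90-day expiry window.
# The standalone plugin needs port 80 free, so nginx is stopped around it.
30 2 1 */2 * root service nginx stop && letsencrypt renew && service nginx start
```

If you used the webroot plugin instead, the stop/start wrapper is unnecessary and `letsencrypt renew` can run on its own.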
Akeem Spencer [1]: https://spencertechconsulting.com/media/card_validated.JPG [2]: http://s-dev.twitter.com/validator [3]: https://blog.khophi.co/postgresql-django-nginx-gunicorn-virtualenvwrapper-16-04-lts-ubuntu-server/ [4]: http://http.debian.net/debian/pool/main/p/python-certbot/python-certbot_0.11.1.orig.tar.gz [5]: https://www.digitalocean.com/community/tutorials/how-to-secure-nginx-with-let-s-encrypt-on-ubuntu-16-04/#how-to-use-the-webroot-plugin [6]: https://spencertechconsulting.com/media/cert_email.JPG [7]: https://spencertechconsulting.com/media/domain_names.JPG [8]: https://mozilla.github.io/server-side-tls/ssl-config-generator/ [9]: https://mozilla.github.io/server-side-tls/ssl-config-generator/ [10]: https://spencertechconsulting.com/media/secure_lock.JPG [11]: https://jeffbradberry.com/posts/2016/01/getting-started-with-lets-encrypt/



January 2017: A 2-Week Summary Jan. 15, 2017

Author: Akeem Spencer

<br> <br> We are two weeks into the new year, and we are bracing ourselves for a transition of power within the executive and legislative branches. The stakes feel high, and our time and attention are finite, so we must choose what we spend them on. Funding is being assessed within hospitals across metropolitan cities, and citizens are preparing their tax returns. Let's stop and take a moment to reflect, my friends; we must remain vigilant in **everything** we've put our time and effort into. The year 2016 was a miraculous year. We found breakthroughs in stem-cell research, medicinal dependency and addiction among patients dropped considerably, and renewable energy is beginning to harness wind, solar, and hydro-generated power, with amazing implications for carbon reduction over time. Additionally, we've seen a precipitous drop in teenage pregnancies and drug use within the adolescent community. Yet we still find ourselves focused on mass-media news reporting and the sick regulation of how you, the consumer, should feel and empathize. We must be cognizant of the people we interact with and take into account what a person's or organization's true intentions are, good or bad. We, personally as well as organizationally, need to ask ourselves (WIIFM) "What's in it for me?" to find the value of our mission. <br> <br> ## **Cybersecurity** ---------- Some Fortune 500 companies had a bad year for DDoS attacks and breaches, names I'm sure you've had transactions with and others you've become aware of, which has led to disarray and fostered fear. For the sanctity of this blog and the fear of backlash I may receive, I've reserved my repudiation of a few of these companies. I remain cautious yet optimistic that we can find a solution and that laws will be enacted to prevent data breaches. There is no mandate for you to take on any subscriptions. Is there an added sense of gloom if an account has been hacked?
It depends on the attack and the information the hacker retrieved, but who's left most responsible? **YOU**. You have the power to protect yourself by making smart, informed decisions about the products you purchase. If you work for a company that depends on advertisements and sponsorship for funding, you are most likely already aware of the role you have in data security. The content manager has a duty to serve information responsibly and safeguard the user's password at all times, giving the user password-change notifications and alphanumeric suggestions to select from. The architects behind the application understand your intelligence, and the team wouldn't sabotage your security or best interests by a long shot. So what makes sense? We live in an age of attraction and of bringing the best content to customers and browsers. So here are some tidbits we are already seeing implemented in response to "fake" news being reported (whatever that means in verbosity) and tightened security. <br> <br> 1. **User Verification** Anyone can buy a domain and sell their content online; a 10-year-old can do that. But social-media companies are taking it a step further and creating _account verification_ modules. If you're a user on YouTube, for instance, you'll usually see a check mark next to a name, indicating the account has spent a large amount of time posting material that does not violate YouTube's publishing rules. So your favorite creator will have it. Now, if a video's account registrant **does not** have a check mark next to their name, you should watch the video with a bit of skepticism if you don't happen to know the user. <br> The year 2017 should be a year of enforcement for added security measures; in that regard, it only makes sense to be extremely meticulous about the content you browse.
Subsequently, if you're following or involved in a thread online, be sure it's validated and comes from a legitimate source; otherwise, trolls will be lurking and looking to make you a target if you share it with the wrong group of individuals and it becomes labeled as a spread of fake news. This idea does have tradeoffs to take into consideration. It's always important to focus on communities with a Creative Commons license.

2. **Ensure HTTPS and SSL encryption.** If a site does not have proper authentication and you're making credit card transactions on it, it is imperative that you pay attention to the security icon in the browser's URL bar. If you're using the Chrome browser, click on that icon to see the cookies the site is using to create the content rendered in your window. Consider every site that you search and visit: if it has long load times, animations flying all over the screen, and 1999-style flashing pop-ups, then maybe you shouldn't give it authorization to know your location and access your private information.

## **Know Your Representatives**
----------

I can go to whoaremyrepresentatives.org and find the following:

1. Charles E. Schumer [United States Senate]
2. Kirsten E. Gillibrand [United States Senate]
3. Andrew M. Cuomo [Governor]
4. Kathy Hochul [Lieutenant Governor]
5. Timothy Idoni [County Clerk]
6. Robert Astorino [County Executive]
7. James A. McCarty [District Attorney]
8. Eric T. Schneiderman [Attorney General]
9. Thomas P. DiNapoli [State Comptroller]
10. Frances Tursi [Commissioner of Jurors]
11. Reginald Lafayette [Elections Commissioner]
12. Douglas A. Colety [Elections Commissioner]
13. Belinda S. Miles [Westchester Community College President]

With clarity about who my elected officials are, I take added comfort in watching how my community evolves day by day. How can I live in the USA and not have a comprehension of my constitutional rights? Take a participatory approach to the officials within your district, and feel fully justified in understanding what your taxes are accounting for. Let's walk in MLK's footsteps and execute the dream we hold within ourselves, aiming for a 100% success rate with amazing throughput!
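Returning to the HTTPS tip above: here is a minimal sketch (in Python, my own choice for illustration; the URLs are hypothetical examples) of the scheme check you should make before entering payment details:

```python
from urllib.parse import urlparse

def uses_https(url: str) -> bool:
    """Return True only when the URL's scheme is HTTPS."""
    return urlparse(url).scheme == "https"

# Hypothetical checkout pages; never enter card details on a plain-HTTP page.
print(uses_https("https://example.com/checkout"))  # True
print(uses_https("http://example.com/checkout"))   # False
```

This is only the first check a browser performs; the padlock icon also reflects whether the site's SSL certificate is valid and trusted.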



Beyond N=30, how well do we understand data? Dec. 18, 2016

Author: Akeem Spencer

When we begin a sample survey, the following workflow will get you through conducting a proper one: <br> <br>

1. Survey Monkey
2. Market-to-target survey
3. Analytical SPSS solver
4. Proposal summary based on the analytics
5. Rinse, recycle, and repeat
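Before step 1, it's worth asking whether n = 30 is enough at all. As a sketch (assuming a proportion survey at a 95% confidence level, a conservative p = 0.5, and a 5% margin of error; the parameters are illustrative), Cochran's formula n = z²·p(1−p)/e² gives the minimum sample size:

```python
import math

def required_sample_size(z: float = 1.96, p: float = 0.5, e: float = 0.05) -> int:
    """Cochran's minimum sample size for estimating a proportion.

    z -- z-score for the confidence level (1.96 for 95%)
    p -- expected proportion (0.5 is the most conservative choice)
    e -- acceptable margin of error
    """
    return math.ceil((z ** 2) * p * (1 - p) / (e ** 2))

print(required_sample_size())  # 385 -- well beyond n = 30
```

Tightening the margin of error to 3% pushes the requirement past 1,000 respondents, which is why "thirty responses" rarely supports a confident proposal summary.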



Why Linear Mathematics May Be Necessary For Any Business Dec. 11, 2016

Author: Akeem Spencer

Within my close group of friends, we used to challenge the idea that we would "never" use calculus, linear algebra, or differential equations on a regular basis. We never understood exactly how the technological boom would shape the way we conduct business from 2011 forward. We use only minimal tools to perform calculations when budgeting for the year, preparing our tax forms, or just guessing the gratuity we should leave the waitstaff for a meal. But in hindsight, should the average **civilian** have a strong command of linear algebra, and not just basic algebra and arithmetic?

When people prepare to fully comprehend computational analytics, some students get stumped on what linear algebra even means. We hold two paradigms when we think of linear algebra: the functional notation we see in outdated textbooks, preserved in clear wrappers and overpriced at your local university (more on that issue in my next post), or processes working in a contingency where the odds and stakes are LOW risk. We as a nation have to flip the script on how we represent our studies, especially when it comes to higher maths. Graphs are exactly what they are, visible charts, but with today's technology graphs can become our strongest ally; with a graph or diagram we can make gauged assessments of how functions operate. Linearity, however, is an abstract concept to fully dissect. Should charts and diagrams pair with concrete applications? They should, at the least.

At Penn State, the course I remember most closely aligning with linear algebra was a 2-hour, 2-credit class called Matrices. Matrices are an important concept that demystifies matrix algebra and linear systems of differential equations, i.e., multivariate problems. However, students like myself had a tough time understanding exactly how matrices combine and work with each other. How do they solve the problems and instances we are bound by?
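To make that concrete, here is a small sketch (in Python, my own choice for illustration) of the kind of problem a matrices course debugs: solving a 2×2 linear system Ax = b by hand with Cramer's rule.

```python
def solve_2x2(a, b, c, d, e, f):
    """Solve the system  a*x + b*y = e,  c*x + d*y = f  via Cramer's rule."""
    det = a * d - b * c  # determinant of the coefficient matrix
    if det == 0:
        raise ValueError("matrix is singular; no unique solution")
    x = (e * d - b * f) / det
    y = (a * f - e * c) / det
    return x, y

# Example: x + 2y = 5  and  3x + 4y = 11
print(solve_2x2(1, 2, 3, 4, 5, 11))  # (1.0, 2.0)
```

Two equations you can check by eye, yet the same determinant idea scales to the multivariate systems the course is really about.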
![MyMathLab][1]

###### What Amazon is showing to customers: [Amazon textbook][2]

How many people struggle day-to-day with payments, scheduling, and parenting while at the same time trying to thrive in a capitalistic market? Any one of those three issues can grip and subjugate a professional into derailment, and like arithmetic, control and management are vital to successful growth as a business or an entity at large. Margins begin to increase solely based upon the resources provided and how close you are to the nearest resource. Hmm, read that again. Margins begin to increase **solely based upon the resources provided**; we face decisions that affect the outcome of a situation, or the desirable place we'd like to see accomplished. But within boundaries, we have to find an **objective** solution that increases output and efficiency for our clients. <br> <br>

So, in combination with the tech boom, everything coincides with a proportionate amount of trust and stability, and without digressing too far from the subject, this is **Linear Programming** in a nutshell (O'Reilly, don't come after me).

So kick off the year right, and remember some questions for consideration if you're a business owner:

1. Are my decisions cost-effective?
2. Should I consolidate on a solver?
3. What are the risks if it's applied?

These items are profoundly important for the business role and acquisition model you wish to mirror. Server RAM and virtual cloud computing are becoming cheaper and cheaper, and with that it's crucial to understand how an ERP system can regulate, in 'real-time', a system that is essential for you to perform well. Thank you for reading my post! <br> <br>

Akeem Spencer

[1]: /media/window-to-apple-share/MyMathLab.JPG
[2]: https://www.amazon.com/dp/0134040236/ref=olp_product_details?_encoding=UTF8&me=
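To show what "Linear Programming in a nutshell" looks like in code, here is a toy sketch with made-up numbers (in practice you would reach for a solver such as `scipy.optimize.linprog`, but for two variables a pure-Python vertex check suffices): maximize profit 3x + 5y subject to x ≤ 4, 2y ≤ 12, 3x + 2y ≤ 18, and x, y ≥ 0.

```python
def feasible(x, y):
    """Constraints of a toy production problem (illustrative numbers)."""
    return x >= 0 and y >= 0 and x <= 4 and 2 * y <= 12 and 3 * x + 2 * y <= 18

def profit(x, y):
    return 3 * x + 5 * y

# For a linear objective over a convex polygon, the optimum sits at a vertex,
# so checking the corner points of the feasible region is enough.
vertices = [(0, 0), (4, 0), (4, 3), (0, 6), (2, 6)]
best = max((v for v in vertices if feasible(*v)), key=lambda v: profit(*v))
print(best, profit(*best))  # (2, 6) 36
```

That one insight, objective plus boundaries equals a vertex of the feasible region, is exactly the "objective solution under boundaries" framing above.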
