Technote
When you log into an Ubuntu server, starting around version 10, you get a nifty message about what the system is doing. I needed to see the current status, but I didn't know how that message was generated.
So, if you want those statistics again, you can get them anytime without logging in by...
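For context on where those stats come from: on Ubuntu 10.04 and later, the login message is assembled from the scripts in /etc/update-motd.d/, and the system-status lines come from the landscape-sysinfo tool (in the landscape-common package). As a rough sketch (mine, not the exact MOTD code), you can pull similar numbers straight out of /proc; the output format here is my own approximation:

```python
# A rough sketch of reproducing the MOTD-style memory stat yourself by
# reading /proc instead of calling landscape-sysinfo. Field names match
# /proc/meminfo; the layout is an approximation, not the exact MOTD output.

def meminfo_usage(meminfo_text):
    """Parse /proc/meminfo-style text and return (used_pct, total_mb)."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        if rest:
            fields[key.strip()] = int(rest.split()[0])  # values are in kB
    total = fields["MemTotal"]
    # Memory reclaimable for programs counts as "free", like the MOTD does.
    free = (fields.get("MemFree", 0) + fields.get("Buffers", 0)
            + fields.get("Cached", 0))
    used_pct = 100.0 * (total - free) / total
    return round(used_pct, 1), total // 1024
```

On a real box, `meminfo_usage(open("/proc/meminfo").read())` gives the current numbers; /proc/loadavg covers system load the same way.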
When supporting a Linux box, it can be handy to know exactly how much memory a process and all its children are taking up.
I created this handy script for when I need to know that...
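The general idea, as a minimal Python sketch (a simplified version, not the script itself): ps can emit pid, ppid, and RSS for every process, and a simple walk over the ppid links totals up a process and all of its descendants.

```python
# Sketch: total the resident memory (RSS, in kB) of a process subtree.
# ps supplies (pid, ppid, rss) triples; the walk follows ppid links.
import subprocess

def subtree_rss(procs, root_pid):
    """procs: iterable of (pid, ppid, rss_kb). Returns total rss_kb of
    root_pid and every descendant."""
    children = {}
    rss = {}
    for pid, ppid, kb in procs:
        children.setdefault(ppid, []).append(pid)
        rss[pid] = kb
    total, queue = 0, [root_pid]
    while queue:
        pid = queue.pop()
        total += rss.get(pid, 0)
        queue.extend(children.get(pid, []))
    return total

def ps_procs():
    """Read (pid, ppid, rss_kb) for every process via ps."""
    out = subprocess.check_output(["ps", "-e", "-o", "pid=,ppid=,rss="])
    return [tuple(int(x) for x in line.split())
            for line in out.decode().splitlines()]
```

Usage on a live system would look like `subtree_rss(ps_procs(), 1234)` for process 1234 and its children.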
I have to re-figure this out every time I need to do it, every few years, so I wrote this to remind me.
Once in a while, I have to change the IP address of a remote network. For instance, Comcast came into a customer of mine and replaced the modem/router. When they did, they put in their own local IP network number and broke the ability to get to any hard-coded IP addresses on the network (like printers). They used to be a 192.168.0.x network and now they were a 10.1.10.x network.
I could change all the hard-coded devices to the new IP address scheme OR I could fix the network. It is a lot easier to change the one network than all the devices (devices I can't easily reach because they are now on another network, so I'd have to set up temporary IP routes, etc. etc. etc...).
The problem with changing the IP address on a router remotely is that, after the change, you lose access to the computer you remoted in on, because it is still on the old network until a reboot.
So here's the simple trick to fix it all at once...
I've written before about getting email to send from dumb devices through a Windows Server host to Gmail (Google Apps). That works fine in a lot of small offices because there is often a Windows Server sitting around somewhere, even if just for file services.
However, there are times when you DON'T have a Windows Server around and need to send email through a Linux server.
Here's what I do when I want to send email via Gmail (Google Apps) and I have an Ubuntu box available...
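The full setup lives on the server side, but the core of it can be sketched with nothing more than Python's standard library: connect to smtp.gmail.com on port 587, upgrade the connection with STARTTLS, and log in. The account and addresses below are placeholders, and a Google Apps account will typically need an app-specific password:

```python
# Minimal sketch of relaying mail through Gmail / Google Apps from a
# Linux box, using only the standard library. Credentials and addresses
# below are placeholders.
import smtplib
from email.mime.text import MIMEText

def build_message(sender, recipient, subject, body):
    msg = MIMEText(body)
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    return msg

def send_via_gmail(user, password, msg):
    # 587 is Gmail's submission port; STARTTLS upgrades the connection
    # to TLS before the login happens.
    with smtplib.SMTP("smtp.gmail.com", 587) as smtp:
        smtp.starttls()
        smtp.login(user, password)
        smtp.send_message(msg)
```

A dumb device can then point at a tiny script like this instead of speaking to Gmail directly.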
I had a customer get a virus (why is that still a possibility????) and had to do some remote cleaning. I tried Malwarebytes, but it wouldn't finish; it kept hitting a weird error after the scan started running.
I was pretty sure I had everything, but I like to take a few passes at a system that has had issues, if I'm NOT doing a format and re-install, just to be safe.
Trying to make sure all was well, I felt I couldn't give the system back until I had a good, clean scan, but that was not to be.
After Malwarebytes let me down for the first time in a few years, I thought I'd try Comodo Cleaning Essentials, a free bit of cleaning kit from Comodo. One thing I like about it is that, unlike Malwarebytes, it doesn't require an install. (It's pretty annoying to install a program in the middle of cleaning up a system.) The problem was, I couldn't get the program to run and finish without it just disappearing!
Man, I ran into a real stumper. I had some customers that had been using Google Apps for email for quite a while. They required Outlook instead of just using the web interface, so, in spite of the disadvantages of using it, we needed to use Google Sync for Outlook.
Everything worked great, until about 14 months into using it, a user reported that some of her emails were "missing". When I checked the web interface, the emails were still there, but they were definitely gone from Outlook.
None of the "helpful" auto-archive features were turned on in Outlook. Outlook will spin your older emails out to a separate PST file if you let it--this makes things "easier" by putting them where they can't be searched, so you have to go specifically looking for them in a place you didn't know exists...
Anyway, I tried a number of things like doing a resync in Google Sync, etc., but nothing helped.
Well, when I was making some changes because a user had left the organization, I discovered something pretty important about how Google Sync actually works...
Most people in tech know that you can use Google Analytics to monitor website performance. App developers and website hackers are probably aware that you can also track your own custom events in your apps. However, very few people know that you can use Google Analytics to monitor your own Windows and Linux (and anything else) servers' disk space, performance, and any other metric you want!
One of the best sites I've found for monitoring the uptime of my services is Uptime Robot. They do a great job monitoring access to my websites and critical services like SMTP for email, SSH, or whatever you like. I can even make sure a certain phrase (like "user login") is appearing on a certain web page! But that only works for internet-facing services.
Many of my cloud service providers (like Rackspace, Amazon, and Digital Ocean) have monitoring services for the servers they run for me, but that doesn't help with my clients' servers behind the firewall.
After just a little fiddling around, I've come up with an easy way of monitoring key metrics on all of my servers using "Universal" Google Analytics...
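The key piece is the Measurement Protocol that "Universal" Google Analytics introduced: any process that can make an HTTP request can POST a "hit" to google-analytics.com, so a server can report, say, free disk space as an event value. A minimal sketch (the UA-XXXXX-Y tracking ID and the names below are placeholders):

```python
# Sketch: report a server metric to Universal Google Analytics via the
# Measurement Protocol (v1). The tracking ID is a placeholder; event
# category/action names are my own convention.
import shutil
import urllib.parse
import urllib.request

GA_ENDPOINT = "https://www.google-analytics.com/collect"

def build_hit(tracking_id, client_id, hostname, metric, value):
    """Build a Measurement Protocol event payload."""
    return urllib.parse.urlencode({
        "v": "1",               # protocol version
        "tid": tracking_id,     # UA-XXXXX-Y property
        "cid": client_id,       # stable per-server ID
        "t": "event",
        "ec": hostname,         # event category: which server
        "ea": metric,           # event action: which metric
        "ev": str(int(value)),  # event value must be an integer
    })

def report_free_disk(tracking_id, client_id, hostname, path="/"):
    free_mb = shutil.disk_usage(path).free // (1024 * 1024)
    payload = build_hit(tracking_id, client_id, hostname,
                        "disk_free_mb", free_mb)
    urllib.request.urlopen(GA_ENDPOINT, data=payload.encode()).read()
```

Drop `report_free_disk(...)` into a cron job on each server and the numbers show up in Analytics as events, graphable over time like any other metric.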
The Raspberry Pi is an awesome little device with an awesome community. Because of what you can do with it and its very low cost, I think it is the current "playground" for future programmers, devops, and other technical professionals. The kids playing with Raspberry Pis today are the tech leaders of tomorrow. What I would have done if I could have built a Linux server for $40 when I was a kid! Or even a server FARM! I'm thinking a data center under my twin bed would have been so much more awesome than that yeast experiment gone bad...
I ordered some for my kids as part of their homeschool curriculum and brought the RPi's home. I left them in the shipping box in the kitchen. When I came home that night, both kids had the RPi's running and spent the afternoon learning how to program!
While designed for kids and education, they also make great project boxes and embedded systems. Thousands of software packages are available with a simple "apt-get install" command, letting you build a tiny little server capable of anything a big server can do, limited only by its 512MB of memory and overclocked 950MHz processor.
I've deployed a couple RPis as network monitors and remote support boxes and continue to experiment with them.
There is tons of information available about the RPi and getting started with it, so I'm not writing another tutorial. Instead, I wanted to document the things that were scattered around and took me a while to figure out, along with some of my best practices.
Here are a couple of recipes for building an RPi useful in these environments...
I have a fairly sophisticated setup for my Sonicwall TZ200. I have 3 internet connections: 1) a traditional T1 at 1.544Mbps, 2) AT&T DSL at 6Mbps, and 3) Comcast at 24Mbps.
I've played with various load balancing schemes, but what has worked best, until recently, is a simple failover system where all my outbound traffic goes out and comes in via Comcast, my email traffic uses the T1, and the AT&T connection acts as a backup connection.
When I tried some percentage-based schemes, they worked, but when users reported the connections being slow, it was always hard to tell which connection was responsible.
Anyway, things were going swimmingly until just a few weeks ago when users began complaining about connections being really slow.
We had been making some changes recently because the TZ200 had been freezing up, and Sonicwall had me redo the entire configuration by hand because of that. That issue turned out to be because we were using the DHCP server in the Sonicwall, which didn't cooperate with our Sonicpoint setup. The Sonicpoint would freeze up and stop passing traffic for no reason. As soon as we moved DHCP to a Windows Server and shut down the DHCP server in the Sonicwall, our Sonicpoint problem went away.
(I digress again...)
Anyway, after that, things had gone very well until, suddenly, a few weeks ago, the performance on the wired network was just horrible! We are supposed to get 24Mbps down from the Comcast connection, but we were lucky to get 10% of that. Our ping times were a horrible 500 to 1000 msec instead of the usual 20 or 30 msec.
Well, as sometimes happens, it took a lot of serious investigation to finally figure out what was wrong...
I had a situation where I had to recover a failing Zimbra email server running Network Edition 6.x. The hardware was failing, so I recovered the system to a new virtual server. There were a lot of things to deal with in moving to a new system, but I had everything back up and running in about 12 hours (there was almost a terabyte of messages to move to the new system).
Everything looked good, but when I went back to do a reality check on the system a week later, I found out that the automatic backups weren't running.
It took a little research, but I figured out how to get backups to run again...
Did this help you? You can help me!
Did you find this information helpful? You can help me back by linking to this page, purchasing from my sponsors, or posting a comment!
Follow me on twitter: http://twitter.com/mojocode