Expand Linux LVM with free space from Hyper-V

This page is a bit of an update to my last post on expanding Linux volumes.

The scenario is a pre-existing VHDX file attached to a Debian virtual machine, which uses LVM to create volumes on it.

Now we want to expand that VHDX file and then extend the pre-existing volumes, without creating new partitions for the volume group to span.

  1. Expand the disk in the Hypervisor
  2. Reboot the VM so it recognizes the larger disk (there are also commands to rescan the disk without rebooting; see the sketch after this list)
  3. Run parted on your disk: parted /dev/sdb
  4. Change display unit to sectors: unit s
  5. Print current partition table and note the start sector for your partition: p
  6. Delete your partition (won’t delete the data or filesystem): rm <number>
  7. Recreate the partition with the starting sector from step 5 above: mkpart primary <start> -1
  8. Exit parted: quit
  9. Reboot the VM to recognize the new partition size
  10. Type the following to resize the physical volume: pvresize /dev/sdb1
  11. Now you can re-allocate size to a logical volume by using the following (this adds 20G to existing size): lvextend -L+20G /dev/volumegroup1/mylogicalvolume
  12. For any logical volume that you resized, you need to extend the ext4 filesystem on it: resize2fs /dev/volumegroup1/mylogicalvolume
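
For reference, here is the whole sequence as a rough, non-interactive shell sketch. The device (/dev/sdb), partition number, 2048 start sector, 20G increase, and volume group/logical volume names are placeholders based on the steps above, and the sysfs rescan line is my assumed alternative to the reboots in steps 2 and 9, so adjust everything to your environment before running it:

# Ask the kernel to rescan the disk so it sees the new VHDX size (instead of rebooting)
echo 1 > /sys/class/block/sdb/device/rescan

# Print the partition table in sectors and note the start sector of the partition
parted /dev/sdb unit s print

# Delete and recreate partition 1, reusing its original start sector (2048 here as an example)
parted /dev/sdb rm 1
parted /dev/sdb unit s mkpart primary 2048s 100%
partprobe /dev/sdb

# Grow the physical volume, the logical volume, and the ext4 filesystem
pvresize /dev/sdb1
lvextend -L +20G /dev/volumegroup1/mylogicalvolume
resize2fs /dev/volumegroup1/mylogicalvolume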

Excel slow to open on Windows 10

I’ve been having an issue with Excel 2013 (from the Office 365 Click to Run installer) for a while now that I finally decided to dig a little deeper on.

I found that when I was double clicking on a file from Windows Explorer to open in Excel, it would take 15-30 seconds before the application would appear.

However, if I opened Excel from the start menu, or through a “Run” command, it would appear instantly.

I tried many different things to isolate this issue, such as checking for conflicting processes, watching Process Monitor, and eliminating the network as a source of the problem.

 

Eventually I hit the right combination of Google keywords and came across this post.

 

Based on that recommendation I disabled Cortana and immediately saw improved response times from Excel. Hopefully Microsoft fixes this bug in time for Office 365 integration with Cortana!

 

High Resolution Photo in Lync for Office 365

My organization uses Office 365 for Exchange and Lync services, although Lync has recently been set up on-premises.

For a while now, the low-resolution photo in Lync has been bothering me, so I set out to find a way to use a high-resolution photo instead.

Microsoft allows a 648×648 photo to be stored in an Exchange 2013 mailbox, which is then used for Lync.

To begin, set up your environment to connect to Office 365 with Powershell:

# Session options to use the proxy settings configured in Internet Explorer
$NPO = New-PSSessionOption -ProxyAccessType IEConfig
# Prompt for your Office 365 administrative credentials
$cred = Get-Credential
# Create and import the Exchange Online remote PowerShell session
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionURI "https://ps.outlook.com/powershell/?proxymethod=rps" -Credential $cred -Authentication Basic -AllowRedirection -SessionOption $NPO
Import-PSSession $Session
# Connect to the MSOnline service with the same credentials
Import-Module MSOnline
Connect-MsolService -Credential $cred
  • Using PowerShell, run the cmdlets above to establish the remote session
  • When prompted for credentials, enter your Office 365 administrative credentials
  • Now use the following PowerShell commands:
# Read the photo (up to 648x648) into a byte array
$photo = ([Byte[]] $(Get-Content -Path "C:\Users\jmiles\Desktop\IMG_0067_Lync.jpg" -Encoding Byte -ReadCount 0))
# Upload the photo to the user's mailbox, then save it
Set-UserPhoto -Identity "Jeff Miles" -PictureData $photo -Confirm:$False
Set-UserPhoto -Identity "Jeff Miles" -Save -Confirm:$False

Replace the photo location and the identity name in the commands above.

 

That’s it! Now your photo should be nice and clear when in a Lync call.

 

Sources:

https://technet.microsoft.com/en-us/library/jj688150.aspx

http://stackoverflow.com/questions/25199254/automated-script-to-change-user-photos-in-microsoft-exchange-2013-powershell/26150403#26150403

https://technet.microsoft.com/en-us/library/jj151815.aspx#bkmk_installmodule

Migrate Mindtouch to Hyper-V

My Mindtouch Core wiki VM was originally running on VMware Server a long time ago. I needed to migrate it to Hyper-V so that I could decommission my use of VMware.

I originally wrote this post more than 2 years ago, but am publishing it now in case someone finds it useful.

 

Used vmdk2vhd to convert the disk to a VHD file.

After transferring and booting, it failed.

Used these instructions to assist in fixing: http://itproctology.blogspot.ca/2009/04/migrating-debian-from-vmware-esx-to.html

mount -t ext3 /dev/hda1 /root

vi /root/etc/fstab (change sda1 to hda1)

vi /root/boot/grub/menu.lst (change sda1 to hda1)
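
Since these are just find-and-replace edits, they can also be scripted. This is a minimal sketch assuming the converted disk really does appear as /dev/hda1 under Hyper-V's emulated IDE controller and is mounted at /root as above:

mount -t ext3 /dev/hda1 /root
# Rewrite every sda1 reference to hda1 in fstab and the GRUB menu
sed -i 's/sda1/hda1/g' /root/etc/fstab
sed -i 's/sda1/hda1/g' /root/boot/grub/menu.lst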

Then added a Legacy Network adapter

Then followed these instructions to install Hyper-V integration services

http://www.r2x2.com/install-hyper-v-integration-services-on-debian-5-x/

DPM 2012 R2 and the downsides

I’ve been using DPM 2012 R2 for a few months now, having replaced Symantec Backup Exec 2010 due to growing data sizes and increased struggles with tape rotations.

However, I’ve found a number of deficiencies in DPM that make me wish we had been able to implement something like Veeam instead.

Here’s a short summary of what I need DPM to do better:

  • No deduplication support!
  • The disk-volume-based storage system leaves ‘islands of storage’ that are unusable and inefficient
    • This prevents disks from being shared for other backup purposes, such as Hyper-V replication
  • Lack of long-term disk backups
    • Our TechNet reading has shown that since DPM uses VSS, it can keep a maximum of only 64 snapshots per protected resource. We’re currently unsure whether this applies to VMs as a protected resource
  • Poor visibility into DPM running operations
    • No clarity on what the data transfer represents
    • No information on compression ratios
    • No transfer speed indicators
  • No easy way to see status of data across all protected sources
    • No dashboards or easy summaries.
    • Many clicks to drill down into each protection group
  • Poor configurability on logging
    • Email notifications are either very chatty or non-existent, with little middle ground
    • No escalation methods or schedules
  • No automated test restore capabilities or scheduling
  • Limited Reporting
    • Only 6 reports out of the box, and you must use SQL Reporting Services to build anything new (which I am adept with, but that’s beside the point)
  • Tape library support seems cumbersome, and compression isn’t working despite reporting that it is running
  • No built in VM replication technology for Disaster Recovery scenarios
  • Very low community knowledge or support
    • For example, trying to find information on tape compression is nearly impossible; very few people online are talking about DPM and how it’s used
  • No central console for viewing multiple backup source/destination pairs