
List:       npaci-rocks-discussion
Subject:    [Rocks-Discuss] Re: user and database recovery from backups missing /var after a front end MB failure
From:       Robert Kudyba <rkudyba () fordham ! edu>
Date:       2019-06-23 12:24:00
Message-ID: CAFHi+KT6mKEu=4=z9a4PKPP9+zGckCcgo_W-Oq+qULAgzZEc_A () mail ! gmail ! com

I fixed the rocks sync config error: the Rocks DB didn't have the correct
SUBNET name, i.e., private, and was missing the IP.

After deleting the /home partition I still have these last errors:

Jun 23 08:15:14 puppet httpd: AH00548: NameVirtualHost has no effect and
will be removed in the next release /etc/httpd/conf.d/411.conf:7

Jun 23 08:15:14 puppet systemd: Started The Apache HTTP Server.

Jun 23 08:15:17 puppet rockscommand[27883]: user root called "sync users"

Jun 23 08:15:17 puppet rockscommand[27883]: run
<rocks.commands.sync.users.plugin_googleotp.Plugin instance at 0x1c5e050>

Jun 23 08:15:17 puppet rockscommand[27883]: run
<rocks.commands.sync.users.plugin_fixnewusers.Plugin instance at 0x1c58b00>

Jun 23 08:15:17 puppet rockscommand[27883]: run
<rocks.commands.sync.users.plugin_411.Plugin instance at 0x1c48878>

Jun 23 08:16:43 puppet systemd: Job
dev-mapper-rocks_puppet\x2dhome.device/start timed out.

Jun 23 08:16:43 puppet systemd: Timed out waiting for device
/dev/mapper/rocks_puppet-home.

Jun 23 08:16:43 puppet systemd: Dependency failed for File System Check on
/dev/mapper/rocks_puppet-home.

Jun 23 08:16:43 puppet systemd: Dependency failed for /home.

Jun 23 08:16:43 puppet systemd: Job home.mount/start failed with result
'dependency'.

Jun 23 08:16:43 puppet systemd: Job
systemd-fsck@dev-mapper-rocks_puppet\x2dhome.service/start
failed with result 'dependency'.

Jun 23 08:16:43 puppet systemd: Job
dev-mapper-rocks_puppet\x2dhome.device/start failed with result 'timeout'.
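Those device/start failures are what systemd reports when /etc/fstab still lists the removed rocks_puppet-home logical volume: the generated mount unit waits for a device that will never appear. A minimal sketch of the cleanup, run against a scratch copy of fstab so it is safe to execute as-is (the two entries below are placeholders; on the frontend you would edit /etc/fstab itself and then run `systemctl daemon-reload` so systemd drops the stale mount job):

```shell
# Comment out the stale fstab entry for the removed rocks_puppet-home LV.
# Scratch copy of fstab; the entries are placeholders, not the real file.
fstab=$(mktemp)
cat > "$fstab" <<'EOF'
/dev/mapper/rocks_puppet-root /      ext4 defaults 1 1
/dev/mapper/rocks_puppet-home /home  ext4 defaults 1 2
EOF

# Comment the stale entry out rather than deleting it, so it stays visible.
sed -i 's|^/dev/mapper/rocks_puppet-home|#&|' "$fstab"

cat "$fstab"
```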



rocks sync users

make: Entering directory `/var/411'

rm -rf /etc/411.d/*

make

make[1]: Entering directory `/var/411'

/opt/rocks/sbin/411put --comment="#" /etc/auto.smb

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..smb

Size: 1823/1044 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/auto.net

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..net

Size: 2860/1808 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/auto.share

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..share

Size: 5981/4123 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/auto.misc

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..misc

Size: 1531/829 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/auto.home

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..home

Size: 16583/11974 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/auto.master

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.auto..master

Size: 729/235 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/ssh/shosts.equiv

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.ssh.shosts..equiv

Size: 806/292 bytes (encrypted/plain)

/opt/rocks/sbin/411put --comment="#" /etc/ssh/ssh_known_hosts

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.ssh.ssh_known_hosts

Size: 3756/2479 bytes (encrypted/plain)

/opt/rocks/sbin/411put --nocomment /etc/passwd

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.passwd

Size: 13320/9559 bytes (encrypted/plain)

/opt/rocks/sbin/411put --nocomment /etc/group

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.group

Size: 13981/10046 bytes (encrypted/plain)

/opt/rocks/sbin/411put --nocomment /etc/shadow

Event '411Alert' dispatched! Coalescing enabled: false

411 Wrote: /etc/411.d/etc.shadow

Size: 12846/9203 bytes (encrypted/plain)

make[1]: Leaving directory `/var/411'

make: Leaving directory `/var/411'
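The sync above pushes /etc/auto.home via 411. Down-thread, a one-liner was suggested for rebuilding auto.home from /etc/passwd, but it fails with "syntax error near unexpected token `>>'" because bash needs a `;` or a newline before `do` and `done`. A corrected sketch, using scratch files so it is safe to run anywhere (the puppet.local server name and the UID > 1000 cutoff are taken from the thread):

```shell
# Corrected auto.home rebuild loop. On the frontend you would read
# /etc/passwd and append to /etc/auto.home; scratch files are used here.
passwd_file=$(mktemp)
auto_home=$(mktemp)

# Sample passwd entries: one system account and two real users (UID > 1000).
cat > "$passwd_file" <<'EOF'
root:x:0:0:root:/root:/bin/bash
alice:x:1001:1001::/home/alice:/bin/bash
bob:x:1002:1002::/home/bob:/bin/bash
EOF

# awk picks login names with UID > 1000; each gets an automount entry
# pointing at the NFS export on the frontend.
for x in $(awk -F: '$3>1000 {print $1}' "$passwd_file"); do
    echo "$x puppet.local:/export/home/$x" >> "$auto_home"
done

cat "$auto_home"
```

After regenerating the file, `make -C /var/411` pushes it out, as shown in the thread.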

I've seen the discussions about making sure httpd is running and setting
HttpProtocolOptions Unsafe; neither made a difference.
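For what it's worth, the AH00548 line at the top of the log is cosmetic: NameVirtualHost is obsolete in httpd 2.4 and ignored, so it can simply be deleted from 411.conf rather than worked around with HttpProtocolOptions. A sketch against a scratch copy (the vhost body and port below are placeholders, not the real /etc/httpd/conf.d/411.conf):

```shell
# Removing the obsolete NameVirtualHost directive silences AH00548; httpd
# 2.4 does name-based virtual-host matching without it. Scratch copy only.
conf=$(mktemp)
cat > "$conf" <<'EOF'
NameVirtualHost *:372
<VirtualHost *:372>
    DocumentRoot /etc/411.d
</VirtualHost>
EOF

sed -i '/^NameVirtualHost/d' "$conf"
cat "$conf"
```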

On Sun, Jun 23, 2019 at 12:01 AM Robert Kudyba <rkudyba@fordham.edu> wrote:

> I sorted out the partition issues, just gave all space from /home to /.
> 
> Now a different error:
> 
> rocks sync config
> 
> Traceback (most recent call last):
> 
> File "/opt/rocks/bin/rocks", line 260, in <module>
> 
> command.runWrapper(name, args[i:])
> 
> File
> "/opt/rocks/lib/python2.7/site-packages/rocks/commands/__init__.py", line
> 1942, in runWrapper
> 
> self.run(self._params, self._args)
> 
> File
> "/opt/rocks/lib/python2.7/site-packages/rocks/commands/report/host/dhcpd/__init__.py",
>  line 416, in run
> 
> self.writeDhcpDotConf(hosts)
> 
> File
> "/opt/rocks/lib/python2.7/site-packages/rocks/commands/report/host/dhcpd/__init__.py",
>  line 360, in writeDhcpDotConf
> 
> node.ip = privateIP
> 
> UnboundLocalError: local variable 'privateIP' referenced before assignment
> 
> 
> rocks list host interface puppet
> 
> SUBNET IFACE    MAC               IP          NETMASK       MODULE NAME
> VLAN OPTIONS CHANNEL
> 
> public enp2s0f1 00:25:90:DC:92:A9 10.10.5.100 255.255.255.0 ------ puppet
> ---- ------- -------
> 
> ------ enp2s0f0 00:25:90:DC:92:A8 ----------- ------------- ------ ------
> ---- ------- -------
> 
> [root@puppet ~]# rocks list host membership
> 
> Which led me to this thread:
> https://lists.sdsc.edu/pipermail/npaci-rocks-discussion/2012-July/058796.html
> 
> Is that an issue not having a label for the private interface?
> 
> 
> On Sat, Jun 22, 2019 at 11:25 PM Robert Kudyba <rkudyba@fordham.edu>
> wrote:
> 
> > What files would I need? I copied from /etc passwd group shadow gshadow
> > and auto.home as well as the SSH keys?
> > 
> > Do you see a way for me to get Rocks 7 working with just the users?
> > Perhaps I'm missing a(some) file(s)?
> > 
> > On Sat, Jun 22, 2019, 11:19 PM Carlson, Timothy S <
> > Timothy.Carlson@pnnl.gov> wrote:
> > 
> > > That would be my guess.  Some Rocks python call that is looking for a
> > > specific thing in the database that isn't there.
> > > 
> > > -----Original Message-----
> > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert Kudyba
> > > Sent: Saturday, June 22, 2019 8:08 PM
> > > To: Discussion of Rocks Clusters <npaci-rocks-discussion@sdsc.edu>
> > > Subject: [Rocks-Discuss] Re: user and database recovery from backups
> > > missing /var after a front end MB failure
> > > 
> > > Got it,  but what about those errors with rocks sync config? Is that a
> > > Rocks 7 vs 6.2 issue?
> > > 
> > > On Sat, Jun 22, 2019, 11:05 PM Carlson, Timothy S <
> > > Timothy.Carlson@pnnl.gov>
> > > wrote:
> > > 
> > > > The permission denied error for root is because /home is supposed to
> > > > be an automount point and you can't touch anything inside an automount
> > > > point.
> > > > 
> > > > All your users id's and associated UIDs are in /etc/passwd. They have
> > > > nothing to do with the rocks database.
> > > > 
> > > > -----Original Message-----
> > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert Kudyba
> > > > Sent: Saturday, June 22, 2019 7:21 PM
> > > > To: Discussion of Rocks Clusters <npaci-rocks-discussion@sdsc.edu>
> > > > Subject: [Rocks-Discuss] Re: user and database recovery from backups
> > > > missing /var after a front end MB failure
> > > > 
> > > > So how can I just get the users and their original permissions and
> > > id's in?
> > > > I don't really care about the database structure, just making sure the
> > > > users can log in as before. I did not find the SQL backup either.
> > > > 
> > > > On Sat, Jun 22, 2019, 10:11 PM Carlson, Timothy S <
> > > > Timothy.Carlson@pnnl.gov>
> > > > wrote:
> > > > 
> > > > > Taking a database directly from 6.2 and putting into a 7.x
> > > > > distribution is unlikely to work. There are changes that get made
> > > > > between distributions that make doing a "dump mysql from 6.2 and
> > > > > push it into 7.0" unlikely to succeed.
> > > > > 
> > > > > -----Original Message-----
> > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert Kudyba
> > > > > Sent: Saturday, June 22, 2019 6:25 PM
> > > > > To: Discussion of Rocks Clusters <npaci-rocks-discussion@sdsc.edu>
> > > > > Subject: [Rocks-Discuss] Re: user and database recovery from backups
> > > > > missing /var after a front end MB failure
> > > > > 
> > > > > There are clearly some differences from our Rocks 6.2 vs the default
> > > > > Rocks
> > > > > 7 as these rocks commands have the following errors and a permission
> > > > > denied in /home:
> > > > > 
> > > > > [root@puppet home]# ls -l /
> > > > > 
> > > > > total 88
> > > > > 
> > > > > lrwxrwxrwx    1 root root     7 Jun 21 23:42 bin -> usr/bin
> > > > > 
> > > > > dr-xr-xr-x.   7 root root  4096 Jun 22 00:34 boot
> > > > > 
> > > > > drwxr-xr-x   20 root root  3560 Jun 22 00:33 dev
> > > > > 
> > > > > drwxr-xr-x. 159 root root 12288 Jun 22 21:21 etc
> > > > > 
> > > > > lrwxrwxrwx.   1 root root    16 Jun  3 17:12 export -> state/partition1
> > > > > 
> > > > > drwxrwxrwx    2 root root     0 Jun 22 20:58 home
> > > > > 
> > > > > lrwxrwxrwx    1 root root     7 Jun 21 23:42 lib -> usr/lib
> > > > > 
> > > > > lrwxrwxrwx    1 root root     9 Jun 21 23:42 lib64 -> usr/lib64
> > > > > 
> > > > > drwx------.   2 root root 16384 Jun  3 17:12 lost+found
> > > > > 
> > > > > drwxr-xr-x.   2 root root  4096 Apr 11  2018 media
> > > > > 
> > > > > drwxr-xr-x.   5 root root  4096 Apr 11  2018 mnt
> > > > > 
> > > > > drwxr-xr-x.  10 root root  4096 Apr 11  2018 opt
> > > > > 
> > > > > dr-xr-xr-x  370 root root     0 Jun 22 00:33 proc
> > > > > 
> > > > > dr-xr-x---.  18 root root  4096 Jun 22 21:20 root
> > > > > 
> > > > > drwxr-xr-x   43 root root  1360 Jun 22 20:58 run
> > > > > 
> > > > > lrwxrwxrwx    1 root root     8 Jun 21 23:42 sbin -> usr/sbin
> > > > > 
> > > > > drwxr-xr-x    2 root root     0 Jun 22 20:58 share
> > > > > 
> > > > > drwxr-xr-x.   2 root root  4096 Apr 11  2018 srv
> > > > > 
> > > > > drwxr-xr-x.   3 root root  4096 Jun  3 17:12 state
> > > > > 
> > > > > dr-xr-xr-x   13 root root     0 Jun 22 20:54 sys
> > > > > 
> > > > > drwxr-xr-x.   3 root root  4096 Jun  3 18:14 tftpboot
> > > > > 
> > > > > drwxrwxrwt.  17 root root 20480 Jun 22 21:21 tmp
> > > > > 
> > > > > drwxr-xr-x.  13 root root  4096 Jun 21 23:42 usr
> > > > > 
> > > > > drwxr-xr-x.  27 root root  4096 Jun 21 23:42 var
> > > > > 
> > > > > [root@puppet home]# touch test
> > > > > 
> > > > > touch: cannot touch 'test': Permission denied
> > > > > 
> > > > > make -C /var/411
> > > > > 
> > > > > make: Entering directory `/var/411'
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.home
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..home
> > > > > 
> > > > > Size: 10953/7806 bytes (encrypted/plain)
> > > > > 
> > > > > make: Leaving directory `/var/411'
> > > > > 
> > > > > rocks sync users
> > > > > 
> > > > > make: Entering directory `/var/411'
> > > > > 
> > > > > rm -rf /etc/411.d/*
> > > > > 
> > > > > make
> > > > > 
> > > > > make[1]: Entering directory `/var/411'
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.smb
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..smb
> > > > > 
> > > > > Size: 1823/1044 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.net
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..net
> > > > > 
> > > > > Size: 2860/1808 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.share
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..share
> > > > > 
> > > > > Size: 5981/4123 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.misc
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..misc
> > > > > 
> > > > > Size: 1531/829 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.home
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..home
> > > > > 
> > > > > Size: 10953/7806 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/auto.master
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.auto..master
> > > > > 
> > > > > Size: 729/235 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/ssh/shosts.equiv
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.ssh.shosts..equiv
> > > > > 
> > > > > Size: 794/280 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --comment="#" /etc/ssh/ssh_known_hosts
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.ssh.ssh_known_hosts
> > > > > 
> > > > > Size: 2342/1431 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --nocomment /etc/passwd
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.passwd
> > > > > 
> > > > > Size: 13320/9559 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --nocomment /etc/group
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.group
> > > > > 
> > > > > Size: 13956/10026 bytes (encrypted/plain)
> > > > > 
> > > > > /opt/rocks/sbin/411put --nocomment /etc/shadow
> > > > > 
> > > > > Warning: 'NoneType' object is not iterable
> > > > > 
> > > > > Alert message will not be sent.
> > > > > 
> > > > > 411 Wrote: /etc/411.d/etc.shadow
> > > > > 
> > > > > Size: 12846/9203 bytes (encrypted/plain)
> > > > > 
> > > > > make[1]: Leaving directory `/var/411'
> > > > > 
> > > > > make: Leaving directory `/var/411'
> > > > > 
> > > > > 
> > > > > 
> > > > > [root@puppet home]# rocks sync config
> > > > > 
> > > > > Traceback (most recent call last):
> > > > > 
> > > > > File "/opt/rocks/bin/rocks", line 260, in <module>
> > > > > 
> > > > > command.runWrapper(name, args[i:])
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/commands/__init__.py",
> > > > > line 1942, in runWrapper
> > > > > 
> > > > > self.run(self._params, self._args)
> > > > > 
> > > > > File
> > > > > 
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/commands/report/host/dhcpd/__init__.py",
> > > > > line 416, in run
> > > > > 
> > > > > self.writeDhcpDotConf(hosts)
> > > > > 
> > > > > File
> > > > > 
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/commands/report/host/dhcpd/__init__.py",
> > > > > line 312, in writeDhcpDotConf
> > > > > 
> > > > > self.makeAttrDictionary()
> > > > > 
> > > > > File
> > > > > 
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/commands/report/host/dhcpd/__init__.py",
> > > > > line 219, in makeAttrDictionary
> > > > > 
> > > > > for row in self.db.fetchall():
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/commands/__init__.py",
> > > > > line 1183, in fetchall
> > > > > 
> > > > > return self.database.fetchall()
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/rocks/db/database.py",
> > > > > line 329, in fetchall
> > > > > 
> > > > > return self.results.fetchall()
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/sqlalchemy/engine/result.py",
> > > > > line 788, in fetchall
> > > > > 
> > > > > self.cursor, self.context)
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/sqlalchemy/engine/base.py",
> > > > > line 1079, in _handle_dbapi_exception
> > > > > 
> > > > > util.reraise(*exc_info)
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/sqlalchemy/engine/result.py",
> > > > > line 782, in fetchall
> > > > > 
> > > > > l = self.process_rows(self._fetchall_impl())
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/sqlalchemy/engine/result.py",
> > > > > line 751, in _fetchall_impl
> > > > > 
> > > > > self._non_result()
> > > > > 
> > > > > File
> > > > > "/opt/rocks/lib/python2.7/site-packages/sqlalchemy/engine/result.py",
> > > > > line 756, in _non_result
> > > > > 
> > > > > "This result object does not return rows. "
> > > > > 
> > > > > sqlalchemy.exc.ResourceClosedError: This result object does not
> > > > > return rows. It has been closed automatically.
> > > > > 
> > > > > 
> > > > > 
> > > > > On Sat, Jun 22, 2019 at 1:48 PM Carlson, Timothy S <
> > > > > Timothy.Carlson@pnnl.gov>
> > > > > wrote:
> > > > > 
> > > > > > Just copy from backup.
> > > > > > 
> > > > > > -----Original Message-----
> > > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert
> > > > > > Kudyba
> > > > > > Sent: Saturday, June 22, 2019 10:08 AM
> > > > > > To: Discussion of Rocks Clusters <npaci-rocks-discussion@sdsc.edu>
> > > > > > Subject: [Rocks-Discuss] Re: user and database recovery from
> > > > > > backups missing /var after a front end MB failure
> > > > > > 
> > > > > > Great, what about /etc/passwd and /etc/shadow?
> > > > > > 
> > > > > > On Sat, Jun 22, 2019 at 12:57 PM Carlson, Timothy S <
> > > > > > Timothy.Carlson@pnnl.gov> wrote:
> > > > > > 
> > > > > > > Just use /etc/auto.home from backup.  The syntax error was due to
> > > > > > > Outlook collapsing my line breaks.
> > > > > > > 
> > > > > > > -----Original Message-----
> > > > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert
> > > > > > > Kudyba
> > > > > > > Sent: Saturday, June 22, 2019 9:42 AM
> > > > > > > To: Discussion of Rocks Clusters
> > > > > > > <npaci-rocks-discussion@sdsc.edu>
> > > > > > > Subject: [Rocks-Discuss] Re: user and database recovery from
> > > > > > > backups missing /var after a front end MB failure
> > > > > > > 
> > > > > > > I also have /etc/auto.home from backups would restoring that
> > > > > > > work with the make command?
> > > > > > > 
> > > > > > > I'm getting this syntax error:
> > > > > > > 
> > > > > > > for x in ` cat passwd | awk -F: '$3>1000 {print $1}'` do echo
> > > > > > > "$x puppet.local:/export/home/$x" >> /etc/auto.home done
> > > > > > > 
> > > > > > > -bash: syntax error near unexpected token `>>'
> > > > > > > 
> > > > > > > 
> > > > > > > On Sat, Jun 22, 2019 at 12:34 PM Carlson, Timothy S <
> > > > > > > Timothy.Carlson@pnnl.gov> wrote:
> > > > > > > 
> > > > > > > > And you can thank outlook for that crappy formatting. The
> > > > > > > > logic is correct but missing a pile of semicolons or line
> > > > > > > > breaks 😊
> > > > > > > > 
> > > > > > > > -----Original Message-----
> > > > > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Carlson,
> > > > Timothy S
> > > > > > > > Sent: Saturday, June 22, 2019 9:27 AM
> > > > > > > > To: Discussion of Rocks Clusters
> > > > > > > > <npaci-rocks-discussion@sdsc.edu>
> > > > > > > > Subject: [Rocks-Discuss] Re: user and database recovery from
> > > > > > > > backups missing /var after a front end MB failure
> > > > > > > > 
> > > > > > > > The users are not in the database.
> > > > > > > > 
> > > > > > > > Do you have /etc/passwd from backup? If so then all you have
> > > > > > > > to do is reconstruct /etc/auto.home which is an "awk" exercise.
> > > > > > > > Something like this
> > > > > > > > 
> > > > > > > > for x in ` cat /etc/passwd | awk -F: '$3>1000 {print $1}'` do
> > > > > > > > echo "$x myclustername.local:/export/home/$x" >>
> > > > > > > > /etc/auto.home done
> > > > > > > > 
> > > > > > > > make -C /var/411
> > > > > > > > 
> > > > > > > > 
> > > > > > > > -----Original Message-----
> > > > > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > > > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert
> > > > > > > > Kudyba
> > > > > > > > Sent: Saturday, June 22, 2019 9:12 AM
> > > > > > > > To: Discussion of Rocks Clusters
> > > > > > > > <npaci-rocks-discussion@sdsc.edu>
> > > > > > > > Subject: [Rocks-Discuss] Re: user and database recovery from
> > > > > > > > backups missing /var after a front end MB failure
> > > > > > > > 
> > > > > > > > Thanks,  I can't find the file in the backups as my
> > > > > > > > predecessor didn't add /var to the rsnapshots.
> > > > > > > > 
> > > > > > > > Is there a way to start from scratch but still retain the
> > > > > > > > users and their id's from the /etc files? I have all of
> > > > > > > > /export.
> > > > > > > > 
> > > > > > > > On Sat, Jun 22, 2019, 10:14 AM Carlson, Timothy S <
> > > > > > > > Timothy.Carlson@pnnl.gov>
> > > > > > > > wrote:
> > > > > > > > 
> > > > > > > > > A rocks cluster creates a backup each day for the database
> > > > > > > > > and it lives as a flat txt file in /var/db/
> > > > > > > > > mysql-backup-cluster
> > > > > > > > > 
> > > > > > > > > This is not the output of "rocks dump". It is a complete
> > > > > > > > > backup of the "cluster" database within mysql.
> > > > > > > > > 
> > > > > > > > > 
> > > > > > > > > -----Original Message-----
> > > > > > > > > From: npaci-rocks-discussion-bounces@sdsc.edu <
> > > > > > > > > npaci-rocks-discussion-bounces@sdsc.edu> On Behalf Of Robert
> > > > > > > > > Kudyba
> > > > > > > > > Sent: Saturday, June 22, 2019 6:39 AM
> > > > > > > > > To: Discussion of Rocks Clusters
> > > > > > > > > <npaci-rocks-discussion@sdsc.edu>
> > > > > > > > > Subject: [Rocks-Discuss] Re: user and database recovery from
> > > > > > > > > backups missing /var after a front end MB failure
> > > > > > > > > 
> > > > > > > > > > 
> > > > > > > > > > # rocks dump
> > > > > > > > > > 
> > > > > > > > > > ...somewhere in a filesystem backup that you have access
> > > > > > > > > > to.
> > > > > > > > > > 
> > > > > > > > > 
> > > > > > > > > 
> > > > > > > > > Is there any grep I can do like find hourly.* -name
> > > > > > > > > '*Kickstart_PrivateNetmask*' -print?
> > > > > > > > > 
> > > > > > > > > > 
> > > > > > > > > > I think that without either a copy of the DB, the rocks
> > > > > > > > > > dump output or other mechanism by which all configuration
> > > > > > > > > > changes applied to the system and stored in the DB were
> > > > > > > > > > recorded you either need to wait or start over if your
> > > > > > > > > > entire frontend (including disks) is out for
> > > > > > > > service.
> > > > > > > > > > 
> > > > > > > > > 
> > > > > > > > > The entire server with disks is out for service.
> > > > > > > > > 
> > > > > > > > > By start over, can we at least keep the user and group ID's
> > > > > > > > > to match from /etc/group & /etc/passwd?
> > > > > > > > > 
> > > > > > > > > 
> > > > > > > > > 
> > > > > > > > 
> > > > > > > > 
> > > > > > > > 
> > > > > > > > 
> > > > > > > 
> > > > > > > 
> > > > > > > 
> > > > > > 
> > > > > > 
> > > > > > 
> > > > > 
> > > > > 
> > > > > 
> > > > 
> > > > 
> > > > 
> > >  
> > > 
> > > 
-------------- next part --------------
An HTML attachment was scrubbed...
URL: http://lists.sdsc.edu/pipermail/npaci-rocks-discussion/attachments/20190623/79ed80d5/attachment.html



